Software Engineer, Data Products
Listed on 2026-01-06
Overview
Who are Yapily
Yapily is on a mission to enable innovative companies to create better and fairer financial services for everyone, through the power of open banking.
Yapily is an open banking infrastructure platform solving a fundamental problem in financial services today: access. Historically, card networks have monopolised the global movement of money, and banks have monopolised the ownership of, and access to, financial data.
Yapily was founded to challenge these structures and create a global open economy that works for everyone. We exist behind the scenes, securely connecting companies - from growth to enterprise - to thousands of banks worldwide, enabling them to access data and initiate payments through the power of open banking.
What we’re looking for
As a Java Software Engineer focused on Data Products at Yapily, you will play a key role in designing and implementing our next-generation data systems. You’ll be responsible for developing high-performance data pipelines, billing infrastructure and APIs that power our suite of products – including Reports API, Analytics API and Insights API – ensuring data is reliably processed and securely delivered to our customers.
- Develop & Optimize Data Pipelines: Design, build, and maintain scalable data ingestion and processing systems to transform raw data into actionable insights.
- Billing Infrastructure: Build and maintain a reliable billing architecture within an event-driven environment.
- Data Products: Design, develop, and maintain APIs that deliver a seamless data experience for our customers.
- Database Management: Work with both SQL and NoSQL databases, optimizing schema designs and queries to support high-volume data transactions.
- Collaborative Problem-Solving: Work closely with BI, infrastructure teams, product managers, and cross-functional teams to deliver data-centric solutions that drive business value.
- Quality Assurance: Implement robust testing, monitoring, and logging practices to ensure the performance and resilience of data systems.
- Continuous Improvement: Engage in code reviews, iterative development, and agile methodologies to continuously enhance product functionality and reliability.
Essential Skills
- 5+ years of hands-on Java development experience in a data-intensive environment.
- Proven experience building and maintaining data pipelines and APIs.
- Strong background in database management, including both SQL and NoSQL databases.
- Experience designing, implementing, and optimizing ETL/ELT processes for high-volume data environments (millions of requests per day).
- Demonstrated expertise in data modeling and schema design for both operational and analytical systems.
- Experience with data validation, data cleaning, and ensuring data quality throughout the pipeline.
- Proficiency working with REST APIs and microservices architectures.
- Knowledge of stream processing frameworks for real-time data processing.
- Experience with cloud-based data services, particularly Google Cloud Platform, is advantageous.
- Familiarity with data orchestration tools and workflow management systems.
- Experience implementing data governance and compliance measures in line with regulations like GDPR and standards like ISO 27001.
- Background in SaaS, API, or telecommunications environments, with specific expertise in billing systems and usage-based data processing.
- Experience supporting BI tools and data visualization platforms, particularly Looker.
- Knowledge of version control and CI/CD practices for data pipeline deployment.
- Experience monitoring and troubleshooting data pipelines in production environments.
- Understanding of data security best practices and encryption methods for sensitive data.
- Ability to optimise data systems for performance, cost, and scalability.
Preferred Skills
- Experience with Python for data processing and automation tasks.
- Knowledge of containerisation technologies (Docker, Kubernetes).
- Experience with IaC (Infrastructure as Code) tools like Terraform.
- Familiarity with event-driven architectures.
- Experience implementing data lineage and metadata management solutions.
- Background in implementing data models for subscription billing, usage-based pricing, or…