
Software Engineer, Data Products

Job in City Of London, Central London, Greater London, England, UK
Listing for: Yapily Ltd
Full Time position
Listed on 2026-01-06
Job specializations:
  • IT/Tech
    Data Engineer, Data Security
Salary/Wage Range: GBP 80,000 - 100,000 per year
Job Description & How to Apply Below
Location: City Of London

Overview

Who are Yapily

Yapily is on a mission to enable innovative companies to create better and fairer financial services for everyone, through the power of open banking.

Yapily is an open banking infrastructure platform solving a fundamental problem in financial services today: access. Historically, card networks have monopolised the global movement of money, and banks have monopolised the ownership of, and access to, financial data.

Yapily was founded to challenge these structures and create a global open economy that works for everyone. We exist behind the scenes, securely connecting companies - from growth to enterprise - to thousands of banks worldwide, enabling them to access data and initiate payments through the power of open banking.

What we’re looking for

As a Java Software Engineer focused on Data Products at Yapily, you will play a key role in designing and implementing our next-generation data systems. You’ll be responsible for developing high-performance data pipelines, billing infrastructure and APIs that power our suite of products – including Reports API, Analytics API and Insights API – ensuring data is reliably processed and securely delivered to our customers.

  • Develop & Optimize Data Pipelines:
    Design, build, and maintain scalable data ingestion and processing systems to transform raw data into actionable insights.
  • Billing Infrastructure:
    Build and maintain a reliable billing architecture within an event driven environment.
  • Data Products:
    Design, develop, and maintain APIs that deliver a seamless data experience for our customers.
  • Database Management:
    Work with both SQL and NoSQL databases, optimizing schema designs and queries to support high-volume data transactions.
  • Collaborative Problem-Solving:
    Work closely with BI, infrastructure teams, product managers, and cross-functional teams to deliver data-centric solutions that drive business value.
  • Quality Assurance:
    Implement robust testing, monitoring, and logging practices to ensure the performance and resilience of data systems.
  • Continuous Improvement:
    Engage in code reviews, iterative development, and agile methodologies to continuously enhance product functionality and reliability.
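As a purely illustrative sketch of the event-driven billing work described above (the class and record names here are hypothetical, not Yapily's actual codebase), a usage-metering handler might consume API usage events and aggregate per-customer billable call counts:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: aggregating metered API usage events into
// per-customer billable counts, as a billing service in an
// event-driven system might. All names are illustrative.
public class BillingAggregator {

    // A single metered API call emitted onto the event stream.
    public record UsageEvent(String customerId, String endpoint, long timestampMillis) {}

    private final Map<String, Long> billableCalls = new HashMap<>();

    // Real billing handlers would also deduplicate events so that
    // redelivery does not double-charge; omitted here for brevity.
    public void onEvent(UsageEvent event) {
        billableCalls.merge(event.customerId(), 1L, Long::sum);
    }

    public long billableCallsFor(String customerId) {
        return billableCalls.getOrDefault(customerId, 0L);
    }

    public static void main(String[] args) {
        BillingAggregator aggregator = new BillingAggregator();
        List<UsageEvent> events = List.of(
            new UsageEvent("cust-1", "/reports", 1L),
            new UsageEvent("cust-1", "/insights", 2L),
            new UsageEvent("cust-2", "/reports", 3L));
        events.forEach(aggregator::onEvent);
        System.out.println(aggregator.billableCallsFor("cust-1")); // 2
    }
}
```

The design choice sketched here (in-memory merge keyed by customer) is the simplest possible aggregation; a production system would persist counts and handle exactly-once semantics.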
Qualifications

Essential Skills

  • 5+ years of hands-on Java development experience in a data-intensive environment.
  • Proven experience building and maintaining data pipelines and APIs.
  • Strong background in database management, including both SQL and NoSQL databases.
  • Experience designing, implementing, and optimizing ETL/ELT processes for high-volume data environments (millions of requests per day).
  • Demonstrated expertise in data modeling and schema design for both operational and analytical systems.
  • Experience with data validation, data cleaning, and ensuring data quality throughout the pipeline.
  • Proficiency working with REST APIs and microservices architectures.
  • Knowledge of stream processing frameworks for real-time data processing.
  • Experience with cloud-based data services, particularly on Google Cloud Platform, is advantageous.
  • Familiarity with data orchestration tools and workflow management systems.
  • Experience implementing data governance and compliance measures in line with regulations like GDPR and standards like ISO 27001.
  • Background in SaaS, API, or telecommunications environments, with specific expertise in billing systems and usage-based data processing.
  • Experience supporting BI tools and data visualization platforms, particularly Looker.
  • Knowledge of version control and CI/CD practices for data pipeline deployment.
  • Experience monitoring and troubleshooting data pipelines in production environments.
  • Understanding of data security best practices and encryption methods for sensitive data.
  • Ability to optimise data systems for performance, cost, and scalability.
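To give a concrete (and entirely hypothetical) flavour of the data validation and cleaning work listed above, an ETL step might filter out records that fail basic quality checks before loading; the `Transaction` record and checks below are illustrative assumptions, not part of the posting:

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical sketch of a validation step in an ETL pipeline:
// drop records that fail basic quality checks before loading.
public class TransactionValidator {

    public record Transaction(String accountId, double amount, String currency) {}

    // A record is loadable only if it has a non-blank account, a finite
    // amount, and a three-letter ISO 4217-style currency code.
    public static boolean isValid(Transaction t) {
        return t.accountId() != null && !t.accountId().isBlank()
            && Double.isFinite(t.amount())
            && t.currency() != null && t.currency().matches("[A-Z]{3}");
    }

    public static List<Transaction> clean(List<Transaction> raw) {
        return raw.stream()
                  .filter(TransactionValidator::isValid)
                  .collect(Collectors.toList());
    }
}
```

In practice a pipeline would also route rejected records to a dead-letter store for inspection rather than silently dropping them.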

Preferred Skills

  • Experience with Python for data processing and automation tasks.
  • Knowledge of containerisation technologies (Docker, Kubernetes).
  • Experience with IaC (Infrastructure as Code) tools like Terraform.
  • Familiarity with event-driven architectures.
  • Experience implementing data lineage and metadata management solutions.
  • Background in implementing data models for subscription billing, usage-based pricing, or…