Senior Data Engineer
Listed on 2026-02-10
Overview
Salary: circa £95,000 base + bonus + benefits (roughly £130k all-in)
Location: Hybrid - 3 days per week in Paddington
The Company
WeDo are working with a high-growth fintech operating regulated payment platforms across multiple countries. As the business continues to expand internationally, data engineering has become a core strategic function, with significant investment in scalable, reliable, and automated data platforms.
The Position
This is a Senior Data Engineering role within the Regulatory Reporting function, focused on building a new data platform to support international growth.
The team currently delivers regulatory reporting for six countries and is scaling to more than thirty. Each country has different reporting rules, formats, and submission processes, creating a complex and highly impactful engineering challenge.
You will play a key role in designing and building this platform from the ground up, owning data models, pipelines, orchestration, and submission workflows. You will work closely with the core Data Engineering platform team while retaining clear ownership of the regulatory reporting layer.
The role is strongly engineering-focused rather than BI-oriented. You will work with both batch and real-time data, integrate with microservices, and help transition the platform towards an event-driven architecture. Accuracy and reliability are critical, as these outputs are submitted directly to regulators.
This is an excellent opportunity for a senior data engineer looking for ownership, architectural influence, and a platform that will grow significantly in scale and complexity.
Responsibilities
- Own data models, pipelines, orchestration, and submission workflows for the regulatory reporting platform.
- Design and build the data platform from the ground up to support international growth.
- Collaborate with the core Data Engineering platform team while retaining ownership of the regulatory reporting layer.
- Work with both batch and real-time data; integrate with microservices; assist in transitioning to an event-driven architecture.
- Focus on accuracy and reliability, as outputs are submitted directly to regulators.
Requirements
- Experience with AWS
- Kafka (or Kinesis; ideally Flink)
- RDS/Redshift
- Python scripting
- dbt/Airflow
- IaC/Kubernetes (nice to have)
Interview Process
Three stages, including a small take-home test.
Interested?
Apply for the position or send your CV to