Data Platform Engineer
Location: City of London, Central London, Greater London, England, UK
Listing for: McCabe & Barton
Full Time position, listed on 2025-12-31
Job specializations: IT/Tech (Data Engineer, Cloud Computing, Data Science Manager)
Job Description
Data Platform Engineer – Permanent
Hybrid (3 days in the office, 2 days WFH)
London
McCabe & Barton are partnering with a leading financial services client to recruit an experienced Data Platform Engineer. This is an excellent opportunity to join a forward-thinking team driving innovation with modern cloud-based data technologies.
Role Overview
As a Data Platform Engineer, you will design, build, and maintain scalable cloud-based data infrastructure using Azure and Databricks. You’ll play a key role in ensuring that data pipelines, architecture, and analytics environments are reliable, performant, and secure.
Key Responsibilities
- Design and implement data pipelines using Azure Data Factory, Databricks, and related Azure services.
- Build ETL / ELT processes to transform raw data into structured, analytics-ready formats.
- Optimise pipeline performance and ensure high availability of data services.
- Architect and deploy scalable data lake solutions using Azure Data Lake Storage.
- Implement governance and security measures across the platform.
- Leverage Terraform or similar IaC tools for controlled and reproducible deployments.
- Develop and optimise data jobs using PySpark or Scala within Databricks.
- Implement the medallion architecture (bronze, silver, gold layers) and use Delta Lake for reliable data transactions (see the sketch after this list).
- Manage cluster configurations and CI / CD pipelines for Databricks deployments.
- Implement monitoring solutions using Azure Monitor, Log Analytics, and Databricks tools.
- Optimise performance, ensure SLAs are met, and establish disaster recovery and backup strategies.
- Partner with data scientists, analysts, and business stakeholders to deliver effective solutions.
- Document technical designs, data flows, and operational procedures for knowledge sharing.
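To illustrate the medallion responsibility above: a minimal, hypothetical PySpark sketch of a bronze-to-silver Delta Lake step might look like the following. The paths, table, and column names here are illustrative assumptions only, not details of the client's platform.

```python
from pyspark.sql import SparkSession, functions as F
from delta import configure_spark_with_delta_pip

# Local Delta-enabled session; on Databricks, `spark` is already provided.
builder = (
    SparkSession.builder.appName("medallion-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Bronze layer: raw records landed as-is (illustrative path).
bronze = spark.read.format("delta").load("/mnt/lake/bronze/trades")

# Silver layer: deduplicated, typed, analytics-ready records
# (trade_id and trade_ts are assumed column names).
silver = (
    bronze.dropDuplicates(["trade_id"])
          .withColumn("trade_ts", F.to_timestamp("trade_ts"))
          .filter(F.col("trade_ts").isNotNull())
)

silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/trades")
```

A gold layer would typically aggregate the silver table into business-level views; the exact layout and tooling are defined by the employer's platform.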
Key Skills & Experience
- 5+ years of experience with Azure services (Azure Data Factory, ADLS, Azure SQL Database, Synapse Analytics).
- Strong hands‑on expertise in Databricks, Delta Lake, and cluster management.
- Proficiency in SQL and Python for pipeline development.
- Familiarity with Git / GitHub and CI / CD practices.
- Understanding of data modelling, data governance, and security principles.
- Experience with Terraform or other Infrastructure-as-Code tools.
- Familiarity with Azure DevOps or similar CI / CD platforms.
- Experience with data quality frameworks and testing.
- Azure Data Engineer or Databricks certifications.
Please apply with an updated CV if you align with the key skills required!