
Enterprise Data Transformation Lead IRC289749

Job in Greater London, London, Greater London, W1B, England, UK
Listing for: GlobalLogic
Full Time position
Listed on 2026-03-04
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range: 60,000 - 80,000 GBP yearly
Job Description & How to Apply Below
Position: Enterprise Data Transformation Lead IRC289749
Location: Greater London

Overview

Designation: Director
Function: Engineering
Experience: 10-15 years
Location: United Kingdom - London
Skills: AI, AWS, Databricks, GCP, Snowflake, Teradata

You will be the architect and orchestrator of the migration journey. This is a high-impact leadership role with "single point of accountability" for transitioning a legacy data ecosystem to a modern cloud-native data architecture.

The individual is responsible for the end-to-end technical vision, from the initial discovery and architectural design to the final decommissioning of legacy hardware. They must own the migration execution strategy, ensuring that the plan is not only technically sound but also aligns with the bank’s rigorous risk, compliance, and continuity standards. Beyond technical delivery, they act as the primary bridge to executive stakeholders, translating complex technical roadmaps into business value and managing the expectations of global department heads.

They are expected to lead, mentor, and drive a high-performing team of engineers and architects to execute a seamless cutover in a zero‑downtime environment.

Requirements
  • Experience leading large-scale data platform modernisation programmes; 5+ years in cloud-specific data platform migrations.
  • Proven ability to manage $10M+ transformation budgets and large cross‑functional teams.
  • Cloud Data Architecture:
    Proven hands‑on experience designing and implementing cloud‑native data platforms, with deep expertise in AWS, GCP, or Azure. Must be proficient in modern services such as Amazon EMR, Redshift, Athena, Glue, Lambda, BigQuery, and Google Dataflow.
  • Expert-level experience with cloud data warehouses (e.g., Snowflake, Databricks, Teradata) is highly desirable.
  • Modern Data Engineering Frameworks:
    Advanced proficiency in big data ecosystems—Hadoop, Spark (including PySpark), Flink—and contemporary MLOps or DataOps tooling (Airflow, dbt, Prefect).
  • Database Expertise:
    Strong command of both traditional (SQL Server, Oracle, PostgreSQL, MySQL) and modern NoSQL platforms (MongoDB, Cassandra, DynamoDB, Cosmos DB).
  • Data Visualisation & Analytics:
    Familiarity with leading BI platforms (Power BI, Tableau, Qlik). Experience integrating with embedded analytics and self‑service data models is a plus.
  • Programming & Automation:
    Strong coding skills in Python, with hands‑on experience in R and/or Scala. Proficiency in infrastructure‑as‑code using Terraform, CloudFormation, or Pulumi. Experience integrating CI/CD pipelines and deploying containerised solutions (Docker, Kubernetes).
  • Agile & Architecture Standards:
    Experience working within Agile and DevOps environments. Skilled in applying architectural frameworks such as TOGAF, Zachman, or the Data Management Body of Knowledge (DMBOK).
  • Data Modelling & Pipeline Design:
    Expertise in conceptual/logical/physical data modelling, metadata management, building robust ETL/ELT pipelines, event streaming (Kafka, Kinesis, Pub/Sub), and supporting cloud and hybrid data migrations.
  • Data Governance & Compliance:
    Deep understanding of data protection, privacy, and governance frameworks (GDPR, CCPA, ISO 27001, DAMA). Experience implementing data quality, auditing, cataloguing (DataHub, Collibra, Alation, OpenMetadata), and master data management capabilities.
  • Emerging Technology Awareness:
    Familiarity with or hands‑on exposure to AI/ML integration, knowledge graphs, data mesh/data fabric architectures, and observability tooling (e.g., Monte Carlo, Datadog, OpenLineage).
  • Deep understanding of the global banking regulatory landscape.
  • Financial Security Standards:
    Ensuring the solution adheres to global banking standards, including SOX, GDPR, and BCBS 239.
  • Resilience & Recovery:
    Ownership of the Disaster Recovery (DR) and High Availability (HA) design, ensuring the new AWS platform meets or exceeds the bank’s RTO/RPO mandates.
  • Audit Readiness:
    Maintaining comprehensive documentation of the design, testing results, and migration logs to provide a clear trail for regulatory examinations.
  • Strong Communication & Leadership:
    Excellent stakeholder engagement, requirements gathering, and technical leadership skills. Proven ability to bridge business needs and technical…