
Lead PySpark Engineer - Data, SAS, AWS

Job in London, Greater London, W1B, England, UK
Listing for: Randstad Technologies Recruitment
Full Time position
Listed on 2026-02-11
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Big Data
Salary/Wage Range or Industry Benchmark: GBP 380.00 per day
Lead Data Engineer - PySpark / AWS / Python / SAS - Financial Sector

As a Lead PySpark Engineer, you will design, develop, and troubleshoot complex data processing solutions using PySpark on AWS. You will work hands-on with code, modernising legacy data workflows and supporting large-scale SAS-to-PySpark migrations. The role requires strong engineering discipline, deep data understanding, and the ability to deliver production-ready data pipelines in a financial services environment.

Essential Skills

PySpark & Data Engineering

A minimum of five years of hands-on PySpark experience.
SAS-to-PySpark migration experience.
Proven ability to write production-ready PySpark code.
Strong understanding of data and data warehousing concepts, including ETL/ELT, data models, dimensions and facts, data marts, and SCDs.

Spark Performance & Optimisation

Strong knowledge of Spark execution concepts, including partitioning, optimisation, and performance tuning.
Experience troubleshooting and improving distributed data processing pipelines.
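The partition-tuning skills above can be illustrated with a small sketch. This is a hypothetical helper (the function name and threshold are illustrative, not from the posting); in a real PySpark job, per-partition row counts might come from something like `rdd.glom().map(len).collect()`, and tuning would rely on Spark's own execution metrics:

```python
# Hypothetical sketch: flag skewed partitions given a list of
# per-partition row counts collected from a Spark job.
def skewed_partitions(sizes, factor=2.0):
    """Return indices of partitions whose row count exceeds
    `factor` times the mean partition size."""
    if not sizes:
        return []
    mean = sum(sizes) / len(sizes)
    return [i for i, n in enumerate(sizes) if n > factor * mean]

# One partition holding most of the data signals a skewed key
# that may need salting or repartitioning.
print(skewed_partitions([100, 110, 95, 900, 105]))  # -> [3]
```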

Python & Engineering Quality

Strong Python coding skills with the ability to refactor, optimise, and stabilise existing codebases.
Experience implementing parameterisation, configuration, logging, exception handling, and modular design.
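As a sketch of the engineering-quality practices listed above (parameterisation, logging, exception handling, modular design), the snippet below shows one illustrative shape they might take; the config keys and function name are hypothetical, and it uses plain Python rows rather than Spark DataFrames:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

# Hypothetical defaults, overridable per environment/run.
DEFAULT_CONFIG = {"amount_field": "amount", "min_amount": 0}

def filter_valid_rows(rows, config=None):
    """Modular transform step: keep rows whose amount field
    parses as a number and meets the configured minimum."""
    cfg = {**DEFAULT_CONFIG, **(config or {})}
    field, minimum = cfg["amount_field"], cfg["min_amount"]
    kept = []
    for row in rows:
        try:
            if float(row[field]) >= minimum:
                kept.append(row)
        except (KeyError, ValueError) as exc:
            # Bad rows are logged and dropped rather than failing the run.
            log.warning("dropping bad row %r: %s", row, exc)
    log.info("kept %d of %d rows", len(kept), len(rows))
    return kept
```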

SAS & Legacy Analytics

Strong foundation in SAS (Base SAS, SAS Macros, SAS DI Studio).
Experience understanding, debugging, and modernising legacy SAS code.

Data Engineering & Testing

Ability to understand end-to-end data flows, integrations, orchestration, and CDC.
Experience writing and executing data and ETL test cases.
Ability to build unit tests, perform comparative testing, and validate data pipelines.
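Comparative testing, mentioned above, typically means checking that a migrated pipeline's output matches the legacy output. A minimal pure-Python sketch (the function names are hypothetical; a real SAS-to-PySpark validation would compare DataFrames and likely many more metrics) might fingerprint each dataset by row count plus an order-independent checksum:

```python
import hashlib

def dataset_fingerprint(rows):
    """Row count plus an order-independent digest of the rows
    (each row a dict), so row ordering differences don't matter."""
    digests = sorted(
        hashlib.sha256(repr(sorted(row.items())).encode()).hexdigest()
        for row in rows
    )
    return len(rows), hashlib.sha256("".join(digests).encode()).hexdigest()

def outputs_match(legacy_rows, migrated_rows):
    """Comparative test: legacy and migrated outputs agree."""
    return dataset_fingerprint(legacy_rows) == dataset_fingerprint(migrated_rows)
```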

Engineering Practices

Proficiency in Git-based workflows, branching strategies, pull requests, and code reviews.
Ability to document code, data flows, and technical decisions clearly.
Exposure to CI/CD pipelines for data engineering workloads.

AWS & Platform Skills

Strong understanding of core AWS services, including: S3, EMR / Glue, Workflows, Athena, IAM
Experience building and operating data pipelines on AWS.
Big data processing on cloud platforms.

Desirable Skills

Experience in banking or financial services.
Experience working on SAS modernisation or cloud migration programmes.
Familiarity with DevOps practices and tools.
Experience working in Agile/Scrum delivery environments.

I have three roles available, all of which can be worked remotely, so don't delay and apply today. I have interview slots ready to be filled.

Randstad Technologies is acting as an Employment Business in relation to this vacancy.