Principal Engineer - Temporary

Job in Greater London, W1B, England, UK
Listing for: Schroders
Seasonal/Temporary, Contract position
Listed on 2026-02-27
Job specializations:
  • Engineering
    Data Engineer
  • IT/Tech
    Data Engineer
Salary/Wage Range or Industry Benchmark: 100,000 - 125,000 GBP yearly
Job Description & How to Apply Below
Position: Principal Engineer - Temporary Contract
Location: Greater London

We are looking for a senior data engineer with extensive experience building robust, scalable data pipelines, ideally in a cloud-native environment. You will already be an established engineer with strong SQL and Python skills, capable of mentoring more junior team members and of helping the Engineering Lead build reusable patterns and high-quality engineering templates for use across Schroders.

Because our modern data platform is centred on Snowflake, deep hands‑on Snowflake engineering experience is essential. You should be comfortable designing, implementing and optimising Snowflake-based solutions including:

  • Snowflake best practices for performance, scalability and cost control
  • Designing Snowflake schemas, optimising micro-partitioning & clustering
  • Building Snowflake tasks, streams, pipes and materialised views
  • Implementing Snowflake security, RBAC, masking and governance patterns
  • Integrating Snowflake with orchestration and ingestion tools
  • Using Snowflake native features such as Snowpark and stored procedures
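The stream-and-task pattern from the list above can be illustrated outside Snowflake itself. Below is a minimal pure-Python sketch, assuming an append-only source; the names (`source_table`, `run_task`, the 0.79 FX rate) are illustrative only and are not Snowflake APIs:

```python
# Sketch of Snowflake-style stream semantics: a "stream" records an offset
# into an append-only source table, and each scheduled "task" run consumes
# only the rows added since the last successful run. Illustrative names only.

source_table = []    # append-only list of row dicts (the tracked table)
stream_offset = 0    # position already consumed, like a stream's offset
target_table = []    # where transformed rows land

def run_task():
    """Consume rows added since the last offset, transform, then advance."""
    global stream_offset
    new_rows = source_table[stream_offset:]
    for row in new_rows:
        # Hypothetical transformation: derive a GBP amount from the raw value.
        target_table.append({**row, "amount_gbp": row["amount"] * 0.79})
    stream_offset = len(source_table)   # advance only after success
    return len(new_rows)

source_table.append({"id": 1, "amount": 100})
source_table.append({"id": 2, "amount": 250})
print(run_task())   # first run consumes the 2 new rows
print(run_task())   # nothing new arrived, so consumes 0 rows
```

In Snowflake the offset bookkeeping is handled for you: a stream advances automatically when a DML statement reads from it inside a task.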

Snowflake certifications are a strong advantage, especially SnowPro Core and SnowPro Advanced: Data Engineer.

Experience with Airflow, dbt and Docker/Kubernetes is highly valuable, particularly where you have used these tools alongside Snowflake to build data ingestion pipelines or support data warehousing use cases.

You’ll need to quickly adopt new technologies to meet evolving business needs, while maintaining high-quality, well‑documented code. As a senior engineer, you will have a solid, practical understanding of data modelling and ideally a background in computer science or engineering, although relevant industry experience will also be considered.

The Team

Enterprise Data Engineering sits within Global Technology. Our recent focus has been helping Schroders embed ESG data throughout our investment processes as part of the Global Sustainability initiative. While that work continues, we are expanding and centralising our engineering capabilities to support the entire organisation in 2025, with Snowflake increasingly at the centre of our data strategy.

What you’ll do

You will work with the team to deliver data capabilities—data sources, tools, and data products—across a wide range of stakeholders and technologies.

This is a hands‑on engineering role with a heavy focus on building, deploying and optimising Snowflake-based solutions. You will collaborate closely with business SMEs and technology partners, supporting both requirements gathering and technical delivery.

Working in agile backlog‑focused squads, you’ll contribute to the analysis, design, implementation and testing of data pipelines, with a particular emphasis on:

  • Snowflake-centric ingestion and transformation pipelines
  • Snowflake performance optimisation and cost-efficient design
  • Implementing dbt or SQL-based transformation layers on Snowflake
  • Operationalising Snowflake for reliability, observability and monitoring
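The emphasis on cost-efficient, reliable pipelines usually comes down to idempotent loads: a rerun must leave the warehouse in the same state as a single run. A minimal sketch of a MERGE-style upsert keyed on a primary key (plain Python, not the Snowflake MERGE statement itself; the fund names are made up):

```python
# Idempotent MERGE-style upsert: rows are keyed on a primary key, so
# re-running the same batch never duplicates data. Illustrative sketch only.

def merge_upsert(target: dict, batch: list) -> dict:
    """Insert or overwrite each row in target, keyed by 'id'; safe to re-run."""
    for row in batch:
        target[row["id"]] = row      # update if present, insert if not
    return target

warehouse = {}
batch = [{"id": 1, "fund": "UK Equity"}, {"id": 2, "fund": "Global ESG"}]

merge_upsert(warehouse, batch)
merge_upsert(warehouse, batch)       # rerun after a retry: same end state
print(len(warehouse))                # 2 rows, not 4
```

The same property is what makes a failed task run safe to retry: replaying the batch converges on the same target state.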

We value continuous learning, innovation and experimentation, and are looking to extend Snowflake capabilities with AI/ML in the future, including Snowpark and native integrations.

The knowledge, experience and qualifications you need

  • Solid Python skills used in data engineering in a commercial environment
  • Excellent SQL / SnowSQL expertise including advanced optimisation techniques
  • Strong knowledge of Snowflake features including tasks, streams, pipes, micro-partitions, Snowpark, RBAC, warehouses and performance tuning
  • Practical understanding of profiling SQL and managing performance trade-offs in Snowflake
  • Good understanding of agile methodologies, with experience contributing to scrum ceremonies
  • Experience with cloud technologies, ideally Azure and AWS, plus Docker/Kubernetes
  • Experience with data quality tooling or implementing a data quality framework
  • Deep knowledge of building cloud‑native data pipelines with strong failure‑handling patterns
  • Excellent understanding of ETL/ELT patterns, idempotency and best practices
  • Strong data modelling skills (3NF, Star Schemas, wide/tall projections), particularly as applied in Snowflake
  • Good knowledge of GitHub and collaborative code development
  • Knowledge of testing frameworks for data pipelines,…
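To make the data-quality and testing bullets above concrete, here is a hypothetical sketch of the kind of rule a data-quality framework formalises; each check returns the failing rows rather than raising, so results can be logged or asserted on (all names and sample data are invented for illustration):

```python
# Two tiny data-quality rules of the kind a framework would provide:
# each returns the offending rows, so a pipeline can report or fail on them.

def check_not_null(rows, column):
    """Rows where the given column is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Rows whose value in the given column repeats an earlier row's value."""
    seen, dupes = set(), []
    for r in rows:
        if r[column] in seen:
            dupes.append(r)
        seen.add(r[column])
    return dupes

rows = [{"id": 1, "nav": 10.5}, {"id": 2, "nav": None}, {"id": 2, "nav": 7.0}]
print(len(check_not_null(rows, "nav")))   # 1 null value found
print(len(check_unique(rows, "id")))      # 1 duplicate id found
```

In a test framework these would become assertions run against each pipeline stage, failing the build when a rule is violated.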