
Data Engineer

Job in London, W1B, England, UK
Listing for: CoinShares
Full Time position
Listed on 2026-03-15
Job specializations:
  • Software Development
    Data Engineer
Salary/Wage Range: GBP 60,000 - 80,000 per year
Job Description & How to Apply Below
Location: Greater London

Our Culture

CoinShares is an innovative, agile and ambitious organisation. We strive for excellence in everything we do. We have a high-performance culture with a focus on:

  • Professional and personal integrity

  • Curiosity and a deep learning mindset

  • Transparency

  • Teamwork and collaboration

CoinShares is strongly committed to diversity and inclusion and warmly welcomes candidates from all backgrounds.

The Team

CoinShares deploys discretionary and systematic, computer-driven trading algorithms across digital assets, cryptocurrencies and derivatives. We have a proven and profitable track record in proprietary trading and are building and expanding our market-making and active investment strategies to complement our world-leading ETP & ETF business.

The Engineering team is responsible for all aspects of software development for the firm, including platform engineering, quant engineering, and ML and AI infrastructure and implementation. As part of a nimble team in a growing organisation, you will collaborate closely with your colleagues to develop real-time solutions.

Our technical stack runs in a microservices architecture, with Golang and Python services deployed on AWS alongside a Java/React user interface. Through our proprietary platform, MATRIX, we connect to 15+ trading venues, managing hundreds of millions of messages and orders per day. You will continue to scale and improve this platform as crypto gains further prominence at the heart of the world financial ecosystem.

Role Profile

CoinShares is undertaking a strategic multi-year modernisation of its data and AI capabilities. As a Data Engineer, you will play a key hands-on role in building the next-generation data platform that supports our ETF/ETP operations, trading, research, marketing, corporate reporting, and future AI-driven innovation.

This is an excellent opportunity for an engineer with strong foundations in Python, Airflow, and SQL/Postgres who wants to work on modern data infrastructure in a collaborative team environment.

You will contribute to designing and delivering scalable ingestion pipelines, analytics‑ready datasets, and reliable orchestration workflows on AWS.

This role is ideal for someone who enjoys building production‑grade data systems and wants to grow into broader platform ownership over time. The role is based in the City of London, with a hybrid working pattern of four days in the office and one day from home.

Responsibilities

Data Pipeline Development

  • Build and maintain ingestion pipelines from APIs, SaaS systems, and internal data sources.

  • Write clean, testable Python code to support ETL/ELT workflows.

  • Develop and optimise SQL transformations in Postgres and analytics layers.
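To give a flavour of the kind of work these bullet points describe, here is a minimal ETL sketch in Python: extract records from a raw API payload, clean and normalise them into analytics-ready rows. All names, fields, and values are illustrative, not CoinShares' actual pipelines.

```python
# Minimal ETL sketch: filter raw API records, then normalise them into
# rows ready for loading into Postgres. All names are illustrative.
from datetime import datetime, timezone

def extract(raw_records):
    """Keep only records that carry a usable price."""
    return [r for r in raw_records if r.get("price") is not None]

def transform(records):
    """Normalise fields into analytics-ready rows."""
    rows = []
    for r in records:
        rows.append({
            "asset": r["asset"].upper(),
            "price": round(float(r["price"]), 2),
            "ingested_at": datetime.now(timezone.utc).isoformat(),
        })
    return rows

raw = [
    {"asset": "btc", "price": "67012.5"},
    {"asset": "eth", "price": None},  # dropped during extract
]
rows = transform(extract(raw))
print(rows[0]["asset"], rows[0]["price"])  # BTC 67012.5
```

In a real pipeline the extract step would call an API client and the load step would write to Postgres; keeping each stage a small, pure function is what makes the code testable.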

Workflow Orchestration (Airflow)

  • Implement and monitor Airflow DAGs for scheduled data processing.

  • Troubleshoot pipeline failures and improve reliability and performance.

  • Contribute to orchestration best practices across the platform.
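As a sketch of what a scheduled Airflow DAG for this kind of work might look like, here is a minimal configuration using the Airflow 2.x TaskFlow API. The DAG name, schedule, and tasks are hypothetical, and running it requires an Airflow installation.

```python
# Hypothetical daily DAG sketch (Airflow 2.x TaskFlow API).
# DAG id, schedule, and task bodies are illustrative only.
from datetime import datetime
from airflow.decorators import dag, task

@dag(
    schedule="0 6 * * *",          # run daily at 06:00 UTC
    start_date=datetime(2026, 1, 1),
    catchup=False,
    tags=["example"],
)
def daily_prices():
    @task(retries=2)
    def extract():
        # pull raw records from an upstream source
        ...

    @task
    def load(rows):
        # write cleaned rows to the warehouse
        ...

    load(extract())

daily_prices()
```

Declaring retries on the extract task and wiring dependencies through return values (rather than manual `set_upstream` calls) are the kind of orchestration best practices the bullets above refer to.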

Data Modelling & Analytics Enablement

  • Help create analytics‑ready datasets for reporting, automation, and internal teams.

  • Support the development of domain-oriented schemas.

  • Work with stakeholders to ensure data is understandable and usable.

Data Quality & Observability

  • Implement data validation, monitoring, and alerting.

  • Support schema consistency, documentation, and governance standards.

  • Assist with improving auditability and reliability of published datasets.
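A lightweight illustration of the validation idea in these bullets: check a batch of rows before publishing and report every issue found. In practice a framework such as Great Expectations or dbt tests might be used; the function and column names here are illustrative.

```python
# Batch validation sketch: return a list of human-readable issues;
# an empty list means the batch passes. Column names are illustrative.

def validate_batch(rows, required=("asset", "price")):
    issues = []
    if not rows:
        issues.append("batch is empty")
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) is None:
                issues.append(f"row {i}: missing '{col}'")
        price = row.get("price")
        if isinstance(price, (int, float)) and price < 0:
            issues.append(f"row {i}: negative price {price}")
    return issues

good = [{"asset": "BTC", "price": 67000.0}]
bad = [{"asset": "ETH", "price": -1.0}, {"asset": None, "price": 5.0}]
print(validate_batch(good))  # []
```

Collecting all issues instead of failing on the first makes the resulting alerts far more useful for troubleshooting, and the returned messages double as an audit trail for published datasets.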

Platform Collaboration

  • Work closely with senior engineers, product teams, and business users.

  • Contribute to the ongoing migration and modernisation of legacy systems.

  • Learn and grow within a modern AWS‑based data platform environment.

Skills & Qualifications

Essential

  • Minimum 2 years of experience in data engineering or related software engineering roles.

  • Strong skills in Python for building data pipelines and backend services.

  • Solid SQL experience, especially with Postgres.

  • Hands‑on experience with Airflow (or similar orchestration tools).

  • Understanding of ETL/ELT concepts and data warehouse/lakehouse patterns.

  • Ability to write maintainable, production‑quality code in a collaborative team.

  • Strong communication skills and willingness to learn from others.

Desirable

  • Familiarity…
