
Data Engineer (AI/ML)

Remote / Online - Candidates ideally in
City Of London, Central London, Greater London, England, UK
Listing for: Satalia
Remote/Work from Home position
Listed on 2025-12-30
Job specializations:
  • Software Development
    Data Engineer, AI Engineer
Job Description & How to Apply Below
Position: Data Engineer (AI / ML)
Location: City Of London

Join to apply for the Data Engineer (AI/ML) role at Satalia.

Satalia, a WPP company, pushes the boundaries of data science, optimisation and artificial intelligence to solve complex industry problems. We are a community of individuals devoted to diverse projects, allowing you to flex your technical skills while working with a tight‑knit team of high‑performing colleagues.

Role Overview

Data Engineer (AI/ML). Permanent.

Location: UK or Greece. Preferred start date: ASAP.

THE ROLE

We are investing heavily in developing next‑generation AI tools for multimodal datasets and a wide range of applications. We build large‑scale, enterprise‑grade solutions for FTSE 100 clients. As a member of our team, you will collaborate with world‑class talent and shape cutting‑edge AI products and services.

Your Responsibilities
  • Collaborate closely with data scientists, architects, and other stakeholders to understand and break down business requirements.
  • Collaborate on schema design, data contracts, and architecture decisions, ensuring alignment with AI/ML needs.
  • Provide data engineering support for AI model development and deployment, ensuring data scientists have access to the data they need in the format they need it.
  • Leverage cloud‑native tools (GCP / AWS / Azure) for orchestrating data pipelines, AI inference workloads, and scalable data services.
  • Develop and maintain APIs for data services and serving model predictions.
  • Support the development, evaluation and productionisation of agentic systems with:
    • LLM‑powered features and prompt engineering
    • Retrieval‑Augmented Generation (RAG) pipelines
    • Multimodal vector embeddings and vector stores
    • Agent development frameworks (e.g., ADK, LangGraph, AutoGen)
    • Model Context Protocol (MCP) for integrating agents with tools, data and AI services
    • Google's Agent2Agent (A2A) protocol for communication and collaboration between different AI agents
  • Implement and optimise data transformations and ETL/ELT processes, using appropriate data engineering tools.
  • Work with a variety of databases and data warehousing solutions to store and retrieve data efficiently.
  • Implement monitoring, troubleshooting, and maintenance procedures for data pipelines to ensure high data quality and optimise performance.
  • Participate in the creation and ongoing maintenance of documentation, including data flow diagrams, architecture diagrams, data dictionaries, data catalogues, and process documentation.
Minimum Qualifications / Skills
  • High proficiency in Python and SQL.
  • Strong knowledge of data structures, data modelling, and database operation.
  • Proven hands‑on experience building and deploying data solutions on a major cloud platform (AWS, GCP, or Azure).
  • Familiarity with containerisation technologies such as Docker and Kubernetes.
  • Familiarity with Retrieval‑Augmented Generation (RAG) applications and modern AI/LLM frameworks (e.g., LangChain, Haystack, Google GenAI).
  • Demonstrable experience designing, implementing, and optimising robust data pipelines for performance, reliability, and cost‑effectiveness in a cloud‑native environment.
  • Experience in supporting data science workloads and working with both structured and unstructured data.
  • Experience working with both relational (e.g., PostgreSQL, MySQL) and NoSQL databases.
  • Experience with a big data processing framework (e.g., Spark).
Preferred Qualifications / Skills
  • API Development:
    Experience building and deploying scalable and secure API services using a framework like FastAPI, Flask, or similar.
  • Experience partnering with data scientists to automate pipelines for model training, evaluation, and inference, contributing to a robust MLOps cycle.
  • Hands‑on experience designing, building, evaluating, and productionising RAG systems and agentic AI workflows.
  • Hands‑on experience with vector databases (e.g., Pinecone, Weaviate, ChromaDB).
WE OFFER
  • Benefits – enhanced pension, life assurance, income protection, private healthcare.
  • Remote working – cafe, bedroom, beach – wherever works.
  • Truly flexible working hours – school pick‑up, volunteering, gym.
  • Generous Leave – 27 days holiday plus bank holidays and enhanced family leave.
  • Annual bonus – when Satalia does well, we all do well.
  • Impac…