Associate Data Platform Engineer
Listed on 2026-02-14
IT/Tech
Data Engineer, Data Analyst
Data is at the heart of what we do.
We are looking for a Data Platform Engineer to join our Data Infrastructure team, and help us build our data platform for analytics, machine learning, marketing and much more.
You will work closely with people from a wide variety of domains within Depop, as well as our Insights, Analytics Engineering, Data Science, MLOps, MarTech and other Data Engineering teams. You will help manage our growing information needs and support increasingly complex business problems by building and promoting self‑service tools and data best practices for use across the organisation. This includes taking ownership of our data transformation and orchestration tooling, batch infrastructure and exploration tools (Databricks, Airflow, dbt), and looking after our data lake (ingestion, storage, governance, privacy).
We're building scalable and robust systems to harvest, process and analyse the vast data within our tech ecosystem. With an increasing demand to service other areas of the business, and ultimately our users, you'll be at the forefront of pioneering Data-as-a-Service.
Want to find out more about Depop & our engineering team? We write about technology, people and smart engineering right here -
Responsibilities
- Play an integral role in owning initiatives for our Data Platform, working closely with our data scientists, analysts, analytics engineers and other engineers to support their deployment speed and productivity needs with self‑serve data transformation and processing tools (dbt, Databricks, Airflow).
- Deliver your team's projects end to end: from scoping and translating business requirements into plans, to design, implementation and maintenance, coordinating with other teams (technical and non-technical users) throughout.
- Proactively identify ways to improve data processes, discovery and ownership, navigating complex challenges as our data grows and becomes an integral piece of our business and product operations.
- Embrace agile methodologies
- Engage in a culture of continuous improvement by attending events such as blameless post-mortems, architecture reviews etc.
- Engage in health and performance improvements of our data platform and work towards promoting company‑wide best practices to allow for their scalable growth by striving for automation, writing clear documentation, tutorials and hosting training sessions.
- Hold high standards for operational excellence; from running your own services to testing, monitoring, maintenance and reacting to production issues.
- Add to a strong engineering culture, oriented towards technical innovation and professional development.
About you
- A strong sense of ownership, autonomy and a highly organised nature.
- Excellent written and spoken English communication skills.
- Comfortable working in a fast‑paced environment and able to respond to change or uncertainty with a positive attitude and a willingness to learn.
- Familiarity with a high‑level programming language (e.g. Python, Scala).
- Some experience using version control tools such as Git, or similar.
- Passionate about working on a self‑service data platform and playing an integral role in designing and creating tools to increase user productivity and velocity across our data organisation.
- You have a passion for learning new things and keeping on top of the latest developments and technologies in our field. We take pride in our learning and make sure to have dedicated time set aside for our growth and development (we offer personal development time and other platforms to share knowledge with your peers!).
- Manage the infrastructure and integrate workflow management and data processing tools such as Airflow, Databricks, dbt or similar.
- Become an expert in managing data lake ingestion platforms, focusing on optimising and monitoring ingestion flows, compute, storage, governance, privacy and more.
- You'll upskill in the data domain, working closely with advanced data users (data scientists, analysts and analytics engineers) to develop a good grasp of their needs and how they operate.
- Python/Scala
- DevOps methodologies: building CI/CD pipelines (Jenkins), IaC…