BI Data Engineer
Published on 2026-02-01
-
IT/Technology
Data Engineer, Data Analyst, Data Science Manager, Data Warehousing
As you contemplate your future, you might be asking yourself, what’s the next step? Start your journey with us!
We are seeking a talented and passionate BI Data Engineer to join our dynamic Business Intelligence team. In this role, you will be instrumental in shaping our data landscape, working on exciting projects including our ongoing migration to a cutting-edge Google Cloud-based data platform. If you thrive in an agile environment, love transforming raw data into actionable insights, and are proficient with modern data stack technologies, this is the opportunity for you.
Our Business Intelligence team is the centre of excellence for generating insights and learning from data. Our goal is to help teams across the organization succeed in their mission by providing insight generation and data analysis support in a healthy, data-informed environment.
Why eDreams ODIGEO
Join the world’s leading travel subscription platform and one of the largest e-commerce businesses in Europe.
Millions of customers every year across 44 markets – 5 brands – over 7.6 million Prime members since launching in 2017.
More than 100 million searches per day on our websites – more than 6 billion daily AI predictions
Over 1,700 employees – More than 60 different nationalities from all continents – 99% permanent contracts
We’re a leading travel tech company, revolutionising the travel booking experience through our consumer insight, innovative technology, market leadership, and Prime, the world’s first travel subscription program.
What you will do
The Role’s Key Responsibilities and Tasks
As an eDOer, you will have clear objectives, great challenges and a clear overview of how your work contributes to the global company project and its customers. As a BI Data Engineer in the Business Intelligence team, you will be in charge of:
- Develop & Optimize Data Pipelines: Design, build, and maintain efficient and reliable data pipelines using Python, SQL, DBT, and Airflow to ingest, transform, and load data from diverse sources into our Google Cloud data warehouse (BigQuery).
- Data Modelling & Architecture: Contribute to the design and implementation of scalable and performant data models and pipelines that support analytical and reporting needs.
- Ensure Data Quality & Integrity: Implement data quality checks and processes to ensure the accuracy, consistency, and reliability of our BI data.
- Collaborate & Innovate: Work closely with product owners, data analysts, and other stakeholders to understand data requirements and translate them into technical solutions. Actively participate in an agile development environment, contributing to sprint planning, reviews, and retrospectives.
- Empower Data Users: Support the development of dashboards and reports, and provide assistance to users to help them leverage data effectively.
- Champion Best Practices: Promote and implement best practices in data engineering, including code quality, testing, documentation, and version control.
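To give a flavour of the "data quality checks" mentioned above, here is a minimal illustrative sketch in Python. It uses the standard library's sqlite3 as a lightweight stand-in for BigQuery; the table and column names (`bookings`, `booking_id`, `amount`) are hypothetical, not part of any actual eDreams schema.

```python
import sqlite3

def run_quality_checks(conn: sqlite3.Connection) -> dict:
    """Return the number of rows failing each basic quality rule.

    A real pipeline would express these as DBT tests or scheduled
    Airflow tasks against BigQuery; sqlite3 is used here only so the
    sketch is self-contained and runnable.
    """
    checks = {
        # Rule 1: primary keys must be unique
        "duplicate_ids": """
            SELECT COUNT(*) FROM (
                SELECT booking_id FROM bookings
                GROUP BY booking_id HAVING COUNT(*) > 1
            )
        """,
        # Rule 2: amounts must be present and non-negative
        "bad_amounts": """
            SELECT COUNT(*) FROM bookings
            WHERE amount IS NULL OR amount < 0
        """,
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

# Small in-memory example with two deliberate data problems
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bookings (booking_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO bookings VALUES (?, ?)",
    [(1, 100.0), (2, -5.0), (2, 30.0), (3, None)],
)
results = run_quality_checks(conn)
print(results)  # {'duplicate_ids': 1, 'bad_amounts': 2}
```

In practice such checks would run after each load and gate downstream models; the same SQL translates almost verbatim to BigQuery or to DBT singular tests.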
Good to have
Bring your unique perspective, speak up, and offer disruptive solutions. You’ll have the opportunity to learn and grow while making a real impact on our team. Here’s what you need to succeed:
- Solid Data Engineering Experience: Proven experience in designing, building, and optimizing data pipelines.
- Python Proficiency: Strong programming skills in Python for data manipulation and automation.
- SQL Expertise: Deep understanding of SQL and experience with complex querying and data modelling.
- DBT (Data Build Tool) Knowledge: Hands-on experience with DBT for transforming data in a modular and testable way.
- Workflow Orchestration with Airflow: Experience in developing and managing data workflows using Apache Airflow (preferably Google Cloud Composer).
- Cloud Data Warehousing: Familiarity with cloud-based data warehousing solutions, ideally Google BigQuery.
- ETL/ELT Tooling: Understanding of ETL/ELT principles and experience with relevant tools.
- Infrastructure Management and Automation: Working knowledge of Google Kubernetes Engine (GKE), Docker, Pub/Sub, Git, and CI/CD.
- Analytical & Problem-Solving Mindset: Ability to analyze complex data challenges,…