
Senior Data Engineer; Modern Data Platform & AI Berlin, hybrid or remote

Remote / Online - Candidates ideally in Germany
Listing for: Aroundhome
Full Time, Remote/Work from Home position
Listed on 2026-01-05
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Job Description & How to Apply Below
Position: Senior Data Engineer (Modern Data Platform & AI) (all genders) | Berlin, hybrid or remote
Location: Germany


We at Aroundhome empower 15.6 million homeowners across Germany and the broader DACH region. Join our team of ~250 people and help shape a data‑centric platform that informs every strategic decision.

You will report to the Data Platform Team Lead and collaborate with Product, Marketing, Finance, and Product Analytics to deliver a reliable, scalable, and accessible data platform company‑wide.

Responsibilities
  • Design, develop, and maintain modern data models and transformation pipelines to support analytics, reporting, and AI use cases.
  • Build and optimize real‑time data pipelines using Kafka, Spark, and Delta Live Tables.
  • Architect a scalable data lakehouse or warehouse, bridging upstream (Debezium, MSK) to downstream consumers.
  • Optimize ETL/ELT pipelines with dbt, Spark, and Airflow, and integrate new technologies to support a hybrid architecture and AI/ML enablement.
  • Establish data governance frameworks, including lineage, quality, catalog, metadata, and SSOT for business glossaries.
  • Implement end‑to‑end testing and Data Life‑Cycle Management practices.
  • Mentor the team on best practices, modern tools (Databricks, Snowflake, AI adapters) and foster a culture of innovation and continuous improvement.
Qualifications
  • Master’s degree in Computer Science, Data Engineering, or related field (or equivalent experience).
  • 10+ years of data engineering experience, including 5+ years in senior roles focused on modern architectures.
  • Proven expertise in designing, developing, and maintaining data lakehouses/DWHs (Databricks Delta Lake, Snowflake) and transformations (dbt, SQL/Python/Spark).
  • Strong experience with AWS services (S3, Athena, MSK/Kafka, Terraform) and real‑time streaming (Kafka, Spark Structured Streaming, Flink).
  • Hands‑on knowledge of data governance tools (Unity Catalog, Collibra) for lineage, quality, catalogs, and SSOT.
  • Familiar with AI/ML pipelines and MLOps (MLflow, feature stores) and complex system integration.
  • Proficiency in CI/CD for data, using Git, Airflow, dbt Cloud, etc.
  • Excellent communication, collaboration and mentoring skills.
Benefits
  • Hybrid work: flexible office days in Berlin, work from home within Germany, or fully remote from Portugal; up to 30 days per year remote from selected EU countries.
  • Personal development: regular feedback, monthly tech talks, annual training budget, LinkedIn Learning access.
  • Well‑being: discounted Urban Sports Club membership; partnership with “Fürstenberg Institute” for mental health & coaching.
  • ESG and social projects: ESG Team initiatives, JobRad, BVG ticket discount, annual Social Day.
  • Community: monthly Company Day at Potsdamer Platz; comprehensive C‑Board updates.