Data Platform Engineer
Listed on 2025-12-11
IT/Tech
Data Engineer, Cloud Computing
Do you want to make a difference and contribute to creating a better world? Are you interested in developing your skills and knowledge, while putting your ideals to work? Join us, we are a frontrunner in ethical banking and global impact investing, and we need people like you to make change happen.
We believe that money can change the world for the better. In fact, our organisation was founded on this belief. Our mission is to create a society that protects and promotes the quality of life of all its members with human dignity at its core.
To further sustain data-driven decision-making and future-proof our organisation, we are evolving towards a modern data platform while managing a complex migration, building the foundation for analytics, machine learning and AI. You make it possible as our new Data Platform Engineer.
How you will make a difference
As a Data Platform Engineer, you will design, build, operate and optimise our Databricks-based data platform on Microsoft Azure. You’ll collaborate with diverse roles such as Data Engineers, Data Analysts, Data Scientists, Data Product Owners, Data Visualisation Experts and Platform & Infrastructure teams to deliver reliable, secure and high-performing data solutions.
One of our major challenges in the coming years will also be migrating our current Enterprise Data Warehouse to this new platform. You will help develop solutions for questions like:
- How can we ensure scalable and cost-efficient data processing for (AI) workloads?
- How do we implement secure and compliant data governance across the platform?
- Which solutions provide the best data-driven value and experience for our users?
We offer a range of challenges and opportunities for you to make a meaningful contribution and shape our data landscape within a dynamic team.
What you will do
- Design, implement and maintain scalable data ingestion, transformation and serving pipelines using Azure Databricks and related Azure services
- Operate and optimise Databricks workspaces and Spark clusters: cluster sizing, autoscaling, job scheduling, performance tuning and cost control
- Implement data governance, access control and cataloguing (Unity Catalog), ensuring compliance and secure data access through Infrastructure-as-Code
- Build and maintain CI/CD pipelines and automated testing for data platform components
- Create observability and monitoring for jobs and data quality; implement alerting and incident response practices
- Collaborate with our Data Architect, Data Engineers and Data Scientists to evolve data models and support analytical and ML workloads
- Contribute to migrating our current on-premise data platform to the new Azure-based platform
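To give a flavour of the observability and data-quality work described above, here is a minimal, library-free sketch of a quality gate that a platform job might run after each ingestion. All names, columns and thresholds are hypothetical; on the actual platform this logic would run in a Databricks job against Delta tables rather than over in-memory rows.

```python
# Hypothetical data-quality gate; column names and thresholds are
# illustrative only, not taken from the actual platform.
from dataclasses import dataclass


@dataclass
class QualityResult:
    check: str
    passed: bool
    detail: str


def check_not_null(rows, column):
    """Fail if any row has a null in the given column."""
    nulls = sum(1 for r in rows if r.get(column) is None)
    return QualityResult(
        check=f"not_null:{column}",
        passed=nulls == 0,
        detail=f"{nulls} null value(s) in '{column}'",
    )


def check_row_count(rows, minimum):
    """Fail if the batch is suspiciously small."""
    return QualityResult(
        check="row_count",
        passed=len(rows) >= minimum,
        detail=f"{len(rows)} row(s), expected >= {minimum}",
    )


def run_checks(rows):
    results = [check_not_null(rows, "customer_id"), check_row_count(rows, 1)]
    failures = [r for r in results if not r.passed]
    # In production, failures would trigger an alert (e.g. a webhook or
    # incident ticket) and block downstream jobs, not just be returned.
    return results, failures
```

The design point is that checks are plain data (`QualityResult`), so the same results can feed both alerting and a data-quality dashboard.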
Your future colleagues
For our new Data Platform Engineer, a (partially virtual) desk is available in our Data Office team (part of Data & Analytics). The team consists of 15 data professionals, divided into three sub-teams focusing on specific organisational topics. We work together in an informal and transparent manner, with lots of room for initiative and development. We offer a variety of (personal) training and the opportunity to experiment in a safe environment.
What you will bring
While you are optimising a Spark cluster for a machine learning workload, a colleague asks you to review a CI/CD pipeline for a new data ingestion process. You can easily switch between tasks and thrive in this dynamic environment without losing focus. You know the best results are achieved together and work closely with Data Product Owners, Data Analysts, Data Visualisation Experts, Data Platform Engineers and other stakeholders.
Thanks to your clear communication, you and your colleagues quickly get to the core of every challenge. This is complemented by:
- 3+ years in data engineering roles with hands-on experience building production data pipelines
- 2+ years operating Azure Databricks in production (notebooks, jobs, cluster policies, Delta Lake), in combination with Unity Catalog
- Strong programming skills in Python/PySpark and expert-level SQL, plus practical knowledge of CI/CD (Azure DevOps/Git) and IaC (Terraform)
- Understanding of Enterprise Data Warehouse and Data Vault modelling concepts
- Familiarity with data governance, security best practices and implementing fine-grained access controls
- Affinity with Agile/Scrum way of working and Big Data initiatives
Besides that:
- You share our mission to create and realise positive impact
- You have a bachelor's or master's degree (or an equivalent working and thinking level)
- You are proactive and communicative
What we offer you
An inspiring work culture
The most important thing about working with us is that you are part of a community that is changing the world for the better. We are constantly improving our work culture to create conditions where every person can thrive, so you are able to:
- find a sense of meaning in your work
- create positive energy and impact with your co-workers
- build mutual trust and respect in working relationships
- enjoy fulfilment at a professional and personal level
A comfortable working environment
Our award-winning sustainable head office is based in…