DVT is one of the leading software development consultancies on the African continent, partnering with top organisations across South Africa and globally to deliver cutting‑edge technology solutions. Our engineers consult on complex, high‑impact projects, working with modern platforms and technologies while collaborating with some of the most established developers locally and internationally. As part of our strategic expansion, DVT is embarking on a targeted recruitment drive to build a strong pipeline of experienced Data Engineers in the UAE.
This initiative is aimed at ensuring we are well positioned to support upcoming client engagements in the region, enabling us to move quickly and effectively as new projects come online.
Senior Data Engineers at DVT play a key role in shaping robust, scalable data platforms that support critical business outcomes for our clients. Working in a collaborative, consulting‑led environment, they contribute to the design and evolution of data solutions that are reliable, efficient, and aligned with long‑term business objectives. DVT is deeply committed to the growth and development of its people.
We foster a strong culture of continuous learning, knowledge sharing, and technical excellence through ongoing training, internal speaking opportunities, and participation in sponsored technical events across the broader technology ecosystem.
Senior Data Engineer
Position Overview
The Senior Data Engineer is a senior consulting role responsible for designing, building, and delivering enterprise‑grade data and analytics platforms, with a strong focus on Databricks‑based lakehouse architectures. This role requires deep hands‑on experience in data migration, ETL/ELT development, data architecture, governance, and cloud‑native platform builds on Microsoft Azure and AWS. The Senior Data Engineer will work closely with solution architects, analysts, data scientists, and business stakeholders to deliver secure, scalable, and well‑governed analytics solutions.
Technical Knowledge
Strong knowledge and extensive hands‑on experience in:
- Databricks platform implementation including Apache Spark, Delta Lake, Unity Catalog, Notebooks, DLT (Delta Live Tables), Lakeflow and AI/BI Genie
- Advanced data engineering using PySpark, Python, and SQL for large‑scale data processing
- Designing and implementing lakehouse architectures and analytics platforms
- ETL/ELT pipeline development for batch and streaming data use cases
- Data migration from on‑premises systems, legacy data warehouses, and ETL tools to cloud‑based lakehouse platforms
- Data modelling techniques including dimensional modelling, star schemas, and medallion architecture
- Data governance, security, and access control using Unity Catalog and cloud‑native security services
- Microsoft Azure data services: Azure Databricks, Azure Data Lake Storage (ADLS Gen2), Azure Data Factory (ADF), Synapse Analytics, Azure Key Vault
- AWS data services: S3, Glue, EMR, Redshift, IAM, and CloudWatch
- Data orchestration and scheduling using ADF, Databricks Workflows, Airflow or similar tools
- Performance tuning, cost optimisation, monitoring, and data reliability best practices
- CI/CD and DevOps practices for data platforms using Git, Azure DevOps, or similar tools
Skills and Attributes
- Strong technical leadership with the ability to own and drive complex delivery outcomes
- Excellent analytical, problem‑solving, and troubleshooting skills
- Ability to communicate complex technical concepts clearly to non‑technical stakeholders
- Consulting mindset with strong client engagement and stakeholder management skills
- Ability to mentor junior engineers and uplift team capability
- Comfortable working in fast‑paced, ambiguous, and multi‑client environments
- Proactive, delivery‑focused, and quality‑driven approach
Key Responsibilities
- Design, build, and implement scalable Databricks‑based data and analytics platforms
- Lead and execute data migration initiatives from legacy and on‑premises systems to cloud platforms
- Develop and maintain robust ETL/ELT pipelines using PySpark, Python, SQL, ADF, and AWS Glue
- Define and implement data architecture patterns aligned to enterprise and cloud…