Mission
Design and maintain robust data pipelines that move and transform information from diverse systems into the data warehouse: reliably, efficiently, and at scale.
Core Responsibilities
- Architect, build, and monitor end-to-end data ingestion pipelines (batch and incremental).
- Implement CDC, schema evolution handling, and automated validation frameworks.
- Manage data lake → warehouse flows using modern orchestration and CI/CD principles.
- Establish and enforce SLAs, lineage, and observability standards.
- Optimize warehouse performance, cost, and reliability.
- Ensure compliance, security, and resilience across all data movement layers.
Requirements
- Advanced SQL and Python.
- Deep understanding of warehouse internals, particularly Snowflake optimization.
- Experience with ETL/ELT tools, API integrations, and orchestration frameworks.
- Strong grounding in data reliability engineering and automation.
Success is measured by pipeline uptime, freshness, data quality, and cost efficiency.
Additional
This role is 5 days a week on-site in Dubai. Applicants should be comfortable in a high-ownership start-up culture, leading 0-to-1 initiatives.
Position Requirements
10+ years of work experience.