Ensures data is accurate, reliable, and easy to understand by building data quality, monitoring, and anomaly detection solutions. Maintains clear end-to-end visibility of where data comes from and how it is used. Builds and supports data pipelines using SQL, Snowflake, Python, Airflow, and API integrations. Clearly explains data issues and insights to senior leaders, including the CIO.
Key Responsibilities
- Data observability and automated data quality frameworks
- Pattern recognition and anomaly detection
- End-to-end data lineage (source-to-target understanding)
- Strong technical skills in SQL, Snowflake, Python, and Airflow
- API-based integration of third-party data sources
- Strong communication skills, with the ability to explain technical work to senior leadership, including the CIO
What We Need
- Bachelor’s/Master’s degree in Computer Science, Information Technology, Engineering, or related field.
- 7+ years of experience in data engineering or data platform development, with at least 1–2 years in a lead role.
- Strong hands-on experience with Snowflake (data warehousing, Snowpipe, performance optimization) and Airflow.
- Proficiency in SQL and Python.
- Experience with Azure, AWS, or GCP cloud data services.
- Solid understanding of data modeling and CI/CD practices.