
Lead Data Engineer

Job in Snowflake, Navajo County, Arizona, 85937, USA
Listing for: iTech India
Full Time position
Listed on 2026-02-07
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Job Description

We are looking for a Lead Data Engineer to drive the development of a modern data platform. This role will focus on building scalable and reliable data pipelines using tools like DBT, Snowflake, and Apache Airflow, and will play a key part in shaping data architecture and strategy.

As a technical leader, you’ll work closely with cross-functional teams including analytics, product, and engineering to deliver clean, accessible, and trustworthy data for business decision-making and machine learning use cases.

Key Responsibilities
  • Lead the design and implementation of ELT pipelines using DBT and orchestrate workflows via Apache Airflow (a minimal illustrative sketch follows this list).
  • Design, implement, and maintain robust data models to support analytics and reporting.
  • Architect and optimize our cloud data warehouse in Snowflake, ensuring performance, scalability, and cost efficiency.
  • Collaborate with data analysts and stakeholders to model and deliver well-documented, production-grade datasets.
  • Establish data engineering best practices around version control, testing, CI/CD, and observability.
  • Build and maintain data quality checks and data validation frameworks.
  • Mentor junior data engineers and foster a strong engineering culture within the team.
  • Collaborate on data governance efforts, including metadata management, data lineage, and access controls.
  • Evaluate and integrate new tools and technologies to evolve our data stack.
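
To make the DBT-plus-Airflow portion of the stack concrete, below is a minimal, hypothetical sketch of how DBT runs and tests could be orchestrated from an Airflow DAG. The DAG id, schedule, and project path are placeholders introduced for illustration and are not part of this posting; it assumes Airflow 2.4+ and a DBT project already configured with a Snowflake profile.

```python
# Illustrative sketch only: orchestrating DBT from Airflow.
# DAG id, schedule, and project path are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_elt_pipeline",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow 2.4+ 'schedule' argument
    catchup=False,
) as dag:
    # Build DBT models against the warehouse (e.g. a Snowflake target in profiles.yml).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/project",
    )

    # Run DBT tests so data quality checks gate downstream consumers.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/project",
    )

    # Tests only run if the models built successfully.
    dbt_run >> dbt_test
```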
Requirements
  • 8+ years of experience in data engineering with at least 2 years in a lead role.
  • Strong experience designing and managing data pipelines with DBT and Airflow.
  • Proven expertise in data modeling techniques (dimensional modeling, star/snowflake schemas, normalization, denormalization) and translating business requirements into scalable data models.
  • Deep understanding of Snowflake, including performance tuning and cost optimization.
  • Strong SQL and Python skills for data transformation and automation.
  • Experience with Git-based workflows and CI/CD for data pipelines.
  • Excellent communication skills and experience working with cross-functional teams.
  • Experience with data cataloging and lineage tools.
  • Exposure to event-driven architectures and real-time data processing.
  • Understanding of data privacy and security standards.
Education
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field