
Data Warehouse Engineer

Job in Snowflake, Navajo County, Arizona, 85937, USA
Listing for: Leader IT Private Limited
Full Time position
Listed on 2025-11-29
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing
Salary/Wage Range: USD 60,000 – 80,000 per year
Job Description & How to Apply Below
Location: Snowflake

We’re looking for a Data Warehouse Engineer (Junior/Mid-Level) to help design, develop, and maintain a modern data warehouse environment. You’ll work on ingesting data from multiple sources, transforming it into analytics-ready models, and ensuring high performance, reliability, and data quality. This role sits at the intersection of data engineering and analytics, with a strong focus on warehouse best practices.

Key Responsibilities

Data Warehouse Development

  • Build and maintain ETL/ELT processes to load data into the warehouse from various internal/external sources
  • Develop and manage warehouse schemas, tables, and views to support analytics and reporting
  • Implement data quality checks, validation, and lineage tracking
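As a rough illustration of the load-plus-validation pattern these bullets describe (table and column names are hypothetical; SQLite stands in for the warehouse):

```python
import sqlite3

# Hypothetical raw feed: order records arriving from an upstream source.
RAW_ORDERS = [
    {"order_id": 1, "customer": "acme", "amount": 120.50},
    {"order_id": 2, "customer": "globex", "amount": 75.00},
]

def load_orders(conn, rows):
    """Create the target table if needed, filter bad rows, and load."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders ("
        "order_id INTEGER PRIMARY KEY, customer TEXT NOT NULL, amount REAL)"
    )
    # Data quality check: reject rows missing a key or with negative amounts.
    valid = [r for r in rows
             if r.get("order_id") is not None and r.get("amount", 0) >= 0]
    conn.executemany(
        "INSERT OR REPLACE INTO fact_orders VALUES (:order_id, :customer, :amount)",
        valid,
    )
    conn.commit()
    return len(valid)

conn = sqlite3.connect(":memory:")
loaded = load_orders(conn, RAW_ORDERS)
```

In a production warehouse the same idea scales up: validation rules live alongside the load step so bad records are quarantined before they reach analytics tables.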

Data Modeling & Performance Tuning

  • Design efficient star/snowflake schemas for reporting and BI tools
  • Write optimized SQL for data transformations and queries
  • Monitor and tune warehouse performance (partitioning, indexing, clustering)
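A minimal star-schema sketch of the kind these bullets refer to, with one fact table joined to two dimension tables (all names are hypothetical; SQLite is used purely for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical star schema: one fact table referencing two dimensions.
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units INTEGER, revenue REAL
);
INSERT INTO dim_date VALUES (20250101, 2025, 1), (20250201, 2025, 2);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO fact_sales VALUES (20250101, 1, 10, 100.0), (20250201, 2, 5, 80.0);
""")

# Typical BI query: monthly revenue by category, joining fact to dimensions.
monthly = cur.execute("""
SELECT d.year, d.month, p.category, SUM(f.revenue) AS revenue
FROM fact_sales f
JOIN dim_date d ON d.date_key = f.date_key
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY d.year, d.month, p.category
ORDER BY d.year, d.month
""").fetchall()
```

The same shape carries over to Snowflake, BigQuery, or Redshift, where partitioning or clustering the fact table on the join/filter keys is the usual performance lever.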

Automation & Orchestration

  • Use workflow/orchestration tools (Airflow, Prefect, dbt, Dagster, or similar) to automate warehouse pipelines
  • Develop reusable Python scripts/libraries for data transformations
  • Work closely with analysts, BI developers, and data scientists to understand data needs
  • Provide analytics-ready datasets and documentation
  • Support troubleshooting and resolve data issues in the warehouse
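Orchestration tools like Airflow, Prefect, and Dagster model a pipeline as a DAG of dependent tasks. A toy stand-in (deliberately not any of those tools' APIs) showing the dependency-ordered execution idea behind them:

```python
# Toy dependency-ordered task runner illustrating the DAG concept behind
# orchestrators like Airflow/Prefect/Dagster. This is NOT their API.
def run_pipeline(tasks, deps):
    """tasks: name -> callable; deps: name -> list of upstream task names."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for upstream in deps.get(name, []):
            run(upstream)            # run dependencies first
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order

log = []
pipeline = {
    "extract":   lambda: log.append("extract"),
    "transform": lambda: log.append("transform"),
    "load":      lambda: log.append("load"),
}
order = run_pipeline(pipeline, {"transform": ["extract"], "load": ["transform"]})
```

Real orchestrators add what this sketch omits: scheduling, retries, backfills, and per-task logging, which is where most day-to-day pipeline troubleshooting happens.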

Required Qualifications

Technical Skills

  • 2–4 years of experience in data engineering or data warehouse development
  • Strong SQL skills; experience with relational databases and data warehousing concepts
  • Proficiency in Python for ETL/ELT and data processing (pandas, PySpark or similar)
  • Familiarity with data modeling (star/snowflake, dimensional modeling)
  • Exposure to at least one modern cloud data warehouse (Snowflake, BigQuery, Redshift, Synapse, etc.)
  • Basic understanding of cloud services (AWS/GCP/Azure)

Soft Skills

  • Good problem-solving and analytical thinking
  • Ability to write clear documentation for data models and pipelines
  • Effective communication with both technical and non-technical stakeholders
  • Eagerness to learn and improve warehouse best practices

Skills Required

Python, SQL, Cloud Platforms, Data Modeling, Data Warehouse Engineering, ETL/ELT, Dimensional Modeling, Cloud Data Warehouses, Snowflake, BigQuery, Redshift, Synapse, Workflow Orchestration, Airflow, Analytics, Agile Software Development
