
Snowflake Modeler/Developer (Azure Focus)

Job in Snowflake, Navajo County, Arizona, 85937, USA
Listing for: xCroTek
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range: USD 80,000 - 100,000 per year
Job Description & How to Apply Below
Position: Snowflake Modeler/Developer (Azure Focus)

Job description

About xCroTek

xCroTek is a product- and service-based AI software company that builds innovative, AI-powered solutions to drive intelligent automation and digital transformation. We specialise in creating scalable, impactful technologies for modern businesses.

Job Overview

We are seeking a talented Snowflake Modeler/Developer with 3-5 years of experience to contribute to our data engineering team by designing and implementing data models and pipelines on Snowflake hosted on Azure. In this role, you will support the Medallion Architecture framework, reverse-engineer source systems, and build efficient, scalable data solutions to drive analytics and business insights. The ideal candidate is a proactive developer who thrives in collaborative Azure environments, with strong skills in data modeling and Snowflake features.

This mid-level position provides opportunities to grow into architecture responsibilities while working on impactful data modernisation projects.

Key Responsibilities

  • Implement Snowflake data pipelines aligned with the Medallion Architecture, including Bronze (raw ingestion), Silver (cleansed and enriched), and Gold (business-ready) layers.
  • Research and analyse source systems (e.g., relational databases, APIs, flat files) to document tables, columns, relationships, and business logic in current data processing flows.
  • Design and develop Snowflake-compatible data models using Erwin for logical and physical designs; generate DDL scripts; and produce comprehensive source-to-target mappings for data transformations.
  • Build and maintain data ingestion and processing features in Snowflake, such as Snowpipe for near-real-time loading, Streams for CDC, Tasks for automation, and materialised views for performance (see the sketch after this list).
  • Integrate Snowflake with Azure services like Azure Data Factory, Synapse Analytics, and Data Lake Storage Gen2 to ensure seamless ETL/ELT workflows and data governance.
  • Collaborate with data architects, engineers, and business stakeholders to refine requirements, test models, and deliver high-quality data assets.
  • Perform query optimisation, data validation, and troubleshooting to maintain data integrity and performance in Azure-hosted environments.
  • Document designs, mappings, and code; participate in code reviews; and support timely project delivery through agile practices.
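For context, a minimal sketch of the Bronze-to-Silver flow described above, written in Snowflake SQL. All object names (bronze.orders_raw, the azure_orders_stage external stage, the AZURE_ORDERS_NOTIFICATION integration, TRANSFORM_WH) are hypothetical, and the Bronze table is assumed to hold raw JSON in a single VARIANT column named raw; real pipelines would follow project-specific naming and design standards.

    -- Bronze: continuous ingestion from an Azure external stage via Snowpipe
    -- (the stage and an Azure Event Grid notification integration are assumed to exist)
    CREATE OR REPLACE PIPE bronze.orders_pipe
      AUTO_INGEST = TRUE
      INTEGRATION = 'AZURE_ORDERS_NOTIFICATION'
    AS
      COPY INTO bronze.orders_raw
      FROM @bronze.azure_orders_stage
      FILE_FORMAT = (TYPE = 'JSON');

    -- Track changes on the Bronze table so Silver processing is incremental (CDC-style)
    CREATE OR REPLACE STREAM silver.orders_changes ON TABLE bronze.orders_raw;

    -- Silver: a Task that runs only when the stream has new rows,
    -- cleansing raw records into a typed Silver table
    CREATE OR REPLACE TASK silver.load_orders
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE  = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('SILVER.ORDERS_CHANGES')
    AS
      INSERT INTO silver.orders_clean (order_id, customer_id, order_ts, amount)
      SELECT raw:order_id::NUMBER,
             raw:customer_id::NUMBER,
             raw:order_ts::TIMESTAMP_NTZ,
             raw:amount::NUMBER(12,2)
      FROM silver.orders_changes;

    -- Tasks are created suspended; resume to start the schedule
    ALTER TASK silver.load_orders RESUME;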

Required Qualifications

  • Bachelor's degree in Computer Science, Information Systems, or a related field.
  • 3-5 years of experience in data modeling, development, and engineering, with at least 2 years focused on Snowflake implementations.
  • Hands‑on experience implementing Medallion Architecture for layered data processing in Snowflake.
  • Proficiency in researching and documenting source systems, including metadata extraction and business rule analysis.
  • Strong skills in data modeling with Erwin, including creating DDLs and source-to-target mapping artefacts for Snowflake compatibility.
  • Experience designing and developing with Snowflake-native features: Snowpipe, Streams, Tasks, Time Travel, and clustering keys.
  • Solid SQL expertise for data manipulation, querying, and optimisation in cloud environments.
  • Knowledge of Azure integrations, such as Azure AD authentication, external stages, and compliance with Azure security standards (e.g., Azure Policy); a stage-setup sketch follows this list.
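As a rough illustration of those integration points, a sketch of connecting Snowflake to ADLS Gen2 through an Azure AD-backed storage integration and an external stage. The tenant ID, storage account, and container paths are placeholders, and an Azure admin would still need to grant the Snowflake-generated service principal access to the storage account.

    -- Storage integration: Snowflake authenticates to ADLS Gen2 via an Azure AD service principal
    CREATE OR REPLACE STORAGE INTEGRATION azure_datalake_int
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'AZURE'
      ENABLED = TRUE
      AZURE_TENANT_ID = '<tenant-id>'
      STORAGE_ALLOWED_LOCATIONS = ('azure://<account>.blob.core.windows.net/raw/');

    -- External stage over the container, usable by COPY INTO and Snowpipe
    CREATE OR REPLACE STAGE bronze.azure_orders_stage
      URL = 'azure://<account>.blob.core.windows.net/raw/orders/'
      STORAGE_INTEGRATION = azure_datalake_int
      FILE_FORMAT = (TYPE = 'JSON');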

Preferred Qualifications

  • Familiarity with ELT tools like dbt or Azure Data Factory for pipeline orchestration in Medallion setups (a sample dbt model follows this list).
  • Exposure to big data processing (e.g., Spark on Azure Databricks) alongside Snowflake.
  • SnowPro Core certification or Microsoft Azure DP-100/DP-203 certifications.
  • Experience with data quality tools (e.g., Great Expectations) and version control (e.g., Git) for collaborative development.
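For the dbt option mentioned above, a hypothetical incremental model performing a Bronze-to-Silver transformation similar to the Task in the earlier sketch. The model, source, and column names are illustrative and assume a bronze.orders_raw source with a VARIANT column named raw declared in a dbt sources file.

    -- models/silver/stg_orders.sql
    {{ config(materialized='incremental', unique_key='order_id') }}

    select
        raw:order_id::number          as order_id,
        raw:customer_id::number       as customer_id,
        raw:order_ts::timestamp_ntz   as order_ts,
        raw:amount::number(12,2)      as amount,
        raw:load_ts::timestamp_ntz    as load_ts
    from {{ source('bronze', 'orders_raw') }}

    {% if is_incremental() %}
      -- on incremental runs, only process rows loaded since the last run
      where raw:load_ts::timestamp_ntz > (select max(load_ts) from {{ this }})
    {% endif %}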

Technical Skills

  • Core: Snowflake (Snowpipe, Streams, Tasks, Medallion Architecture), SQL, Data Modeling (Erwin, DDLs, Source-to-Target Mapping)
  • Azure Integration: Azure Data Factory, Synapse, ADLS Gen2, Azure AD
  • Tools & Technologies: dbt, Fivetran, Source System Analysis (SQL Server, Oracle, etc.)
  • Concepts: Data Lineage, Performance Tuning, ELT Development, Azure Security

Soft Skills

  • Excellent communication skills to clearly convey technical details to diverse audiences and foster team collaboration.
  • Demonstrated ability to meet deadlines and deliver high-quality work in a dynamic, project-driven Azure setting.
  • Detail-oriented with a collaborative spirit and eagerness to learn and adapt.

How to Apply

Interested candidates should submit their resume and a brief cover letter explaining their interest in the role to  Please include "Snowflake Modeler/Developer (Azure Focus)" in the subject line.

