Snowflake Data Engineer - W2 Only, No C2C
Job in Indianapolis, Indiana, 46262, USA
Listing for: Cygnus Professionals Inc.
Full Time position, listed on 2026-01-01
Job specializations:
- IT/Tech: Data Engineer, Data Warehousing
Job Description
Location: Indianapolis
Snowflake Data Engineer - W2 Only, No C2C - 8+ Years
We are seeking a Snowflake Data Engineer to design, build, and optimize scalable data solutions on the Snowflake Data Cloud.
This role will support analytics, reporting, and AI/ML initiatives across commercial, manufacturing, R&D, and quality systems.
The ideal candidate has strong expertise in cloud data engineering, ELT pipelines, and enterprise-grade data platforms within regulated environments.
Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, Data Science, or a related field
- 8+ years of hands-on experience in data engineering, with a strong focus on Snowflake Data Warehouse design and management
- Extensive experience designing, building, and managing enterprise‑scale Snowflake data warehouses
- Strong hands-on experience with Snowflake (SQL, Virtual Warehouses, Snowpipe, Streams, Tasks, Time Travel, Zero-Copy Cloning); a short SQL sketch of these features follows this list
- Proven expertise in Snowflake warehouse management, including sizing, multi-cluster warehouses, workload isolation, concurrency scaling, and cost optimization (see the warehouse-management sketch after this list)
- Proficiency in SQL and Python for data transformations and orchestration
- Experience with cloud platforms: AWS, Azure, or GCP
- Experience building robust ELT pipelines and working with structured and semi-structured data (JSON, Parquet, Avro); see the semi-structured ELT sketch after this list
- Strong knowledge of data modeling for data warehouses (star/snowflake schemas, dimensional modeling)
- Experience implementing data governance, security, and access controls in Snowflake (RBAC, masking policies, row access policies); see the governance sketch after this list
- Experience with Git-based version control and CI/CD pipelines
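
As a point of reference for the Snowflake features named above, here is a minimal SQL sketch of Zero-Copy Cloning and Time Travel; all database and table names are hypothetical, not project code:

```sql
-- Zero-copy cloning: an instant, storage-efficient copy of production.
CREATE DATABASE analytics_dev CLONE analytics_prod;

-- Time Travel: query a table as it existed one hour ago.
SELECT *
FROM analytics_prod.sales.orders
  AT (OFFSET => -3600);

-- Time Travel also recovers a dropped table within the retention window.
UNDROP TABLE analytics_prod.sales.orders;
```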
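The warehouse-management sketch referenced above, again with hypothetical names: a multi-cluster warehouse sized for concurrency and tied to a resource monitor for cost control (resource monitors also appear in the responsibilities below):

```sql
-- Multi-cluster warehouse: scales out under concurrency, suspends when idle.
CREATE WAREHOUSE IF NOT EXISTS bi_wh
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3
  SCALING_POLICY    = 'STANDARD'
  AUTO_SUSPEND      = 60      -- seconds idle before auto-suspend
  AUTO_RESUME       = TRUE;

-- Resource monitor: cap monthly credit spend, suspend the warehouse on breach.
CREATE RESOURCE MONITOR bi_wh_monitor
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE bi_wh SET RESOURCE_MONITOR = bi_wh_monitor;
```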
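The semi-structured ELT sketch referenced above: raw JSON lands in a VARIANT column and is flattened relationally with SQL. The stage, file layout, and field names are assumptions for illustration:

```sql
-- Land raw JSON in a VARIANT column.
CREATE TABLE raw_events (payload VARIANT);

COPY INTO raw_events
  FROM @my_stage/events/
  FILE_FORMAT = (TYPE = 'JSON');

-- Flatten a nested array into relational columns.
SELECT
  payload:device_id::STRING AS device_id,
  payload:ts::TIMESTAMP_NTZ AS event_ts,
  f.value:reading::FLOAT    AS sensor_reading
FROM raw_events,
     LATERAL FLATTEN(input => payload:readings) f;
```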
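And the governance sketch referenced above: a masking policy, a row access policy, and role-based grants. Role names, table names, and the role-to-region mapping table are all hypothetical:

```sql
-- Dynamic data masking: only the PII_READER role sees raw email values.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
       ELSE '***MASKED***' END;

ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;

-- Row access policy: filter rows via a (hypothetical) role-to-region map.
CREATE ROW ACCESS POLICY region_policy AS (region STRING) RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'ADMIN'
  OR EXISTS (SELECT 1 FROM security.region_role_map m
             WHERE m.role_name = CURRENT_ROLE() AND m.region = region);

ALTER TABLE orders ADD ROW ACCESS POLICY region_policy ON (region);

-- RBAC: grant read access through roles, never directly to users.
GRANT USAGE ON DATABASE analytics_prod TO ROLE analyst;
GRANT USAGE ON SCHEMA analytics_prod.sales TO ROLE analyst;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics_prod.sales TO ROLE analyst;
```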
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Snowflake as the core data platform (a stream-and-task sketch follows this list)
- Build and optimize ELT workflows using tools such as dbt, Airflow, Matillion, or Fivetran
- Implement data models (star/snowflake schemas) to support analytics, BI, and advanced analytics use cases (a dimensional-model sketch follows this list)
- Optimize Snowflake performance and cost (warehouses, clustering, caching, resource monitors)
- Integrate data from diverse sources: ERP (SAP), CRM (Salesforce), manufacturing systems, LIMS, IoT, and external data feeds
- Ensure data quality, governance, lineage, and metadata management in compliance with regulatory standards (GxP, FDA, ISO)
- Collaborate with data analysts, data scientists, product teams, and business stakeholders
- Implement CI/CD, version control, and automated testing for data pipelines
- Support data security, access controls, and compliance requirements
- Participate in architecture reviews and contribute to enterprise data strategy
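
A sketch of the stream-and-task pipeline pattern referenced in the first responsibility; all object names are hypothetical. A Stream captures changes on a source table and a scheduled Task applies them incrementally:

```sql
-- Change capture on the source table.
CREATE STREAM orders_stream ON TABLE raw.sales.orders;

-- Scheduled task: runs only when the stream actually has new rows.
CREATE TASK load_orders_task
  WAREHOUSE = etl_wh
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  INSERT INTO curated.sales.orders_clean
  SELECT order_id, customer_id, amount, order_ts
  FROM orders_stream
  WHERE METADATA$ACTION = 'INSERT';

ALTER TASK load_orders_task RESUME;  -- tasks are created suspended
```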
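And the dimensional-model sketch referenced above: a minimal star schema with one fact table keyed to two dimensions. Names are illustrative; note that Snowflake records but does not enforce primary- and foreign-key constraints:

```sql
CREATE TABLE dim_product (
  product_key  INTEGER AUTOINCREMENT PRIMARY KEY,
  product_code STRING,
  product_name STRING,
  category     STRING
);

CREATE TABLE dim_date (
  date_key INTEGER PRIMARY KEY,  -- e.g. 20260101
  day_date DATE,
  month    INTEGER,
  year     INTEGER
);

CREATE TABLE fact_sales (
  product_key INTEGER REFERENCES dim_product (product_key),
  date_key    INTEGER REFERENCES dim_date (date_key),
  quantity    INTEGER,
  net_amount  NUMBER(12,2)
);
```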