DBT/Informatica Lead
Job in Chicago, Cook County, Illinois, 60290, USA
Listed on 2026-02-07
Listing for:
Compunnel, Inc.
Full Time position
Job specializations:
- IT/Tech: Data Engineer
Job Description
The DBT/Informatica Lead will be responsible for defining the target ELT architecture using dbt on Snowflake and leading the migration of legacy Informatica workflows into modern, modular dbt pipelines.
This role requires strong expertise in data engineering, ELT modernization, Snowflake optimization, and Azure data ecosystem integration.
The ideal candidate will drive the migration strategy, establish modeling and coding standards, oversee conversion activities, and ensure high-quality, production‑grade data pipelines.
Key Responsibilities:
- Define the target ELT architecture using dbt on Snowflake, integrated with Azure services such as ADF, ADLS, Synapse/Databricks, and Key Vault.
- Establish modeling standards including naming conventions, layer structure (staging, integration, marts), and package organization.
- Define and implement performance‑optimized patterns in dbt and Snowflake (incremental models, clustering, partitioning, query tuning).
- Lead the overall migration strategy and roadmap for converting Informatica workflows to dbt on Snowflake.
- Translate legacy Informatica mappings, sessions, and workflows into modular dbt models.
- Analyze Informatica ETL logic, dependencies, and schedules to design equivalent or improved dbt logic.
- Build a repeatable “migration factory” including templates, accelerators, mapping sheets, and conversion playbooks.
- Oversee conversion, unit testing, and parallel runs to validate output parity (row counts, aggregates, business rules).
- Lead hands‑on development of dbt models, seeds, snapshots, macros, tests, and documentation.
- Define and implement dbt testing strategy (schema tests, data tests, custom tests) integrated with broader data quality frameworks.
- Set up and maintain dbt environments (dev/test/prod), profiles, and Snowflake connections.
- Enforce code quality practices such as modular design, reusable packages, code reviews, and PR workflows.
- Integrate dbt with Azure Data Factory, ADLS, and Azure DevOps/GitHub for orchestration and DevOps enablement.
- Optimize Snowflake performance through warehouse design, schema engineering, and compute optimization.
- Ensure secure and scalable deployment of dbt pipelines within the Azure cloud ecosystem.
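The performance-optimized patterns called out above (incremental models, clustering) can be sketched as a minimal dbt model on Snowflake. The model, source, and column names (`fct_orders`, `stg_orders`, `order_id`, `order_date`) are illustrative assumptions, not taken from the posting:

```sql
-- models/marts/fct_orders.sql (illustrative names; a sketch, not this role's actual codebase)
{{ config(
    materialized = 'incremental',
    unique_key   = 'order_id',
    cluster_by   = ['order_date']   -- Snowflake clustering key for pruning
) }}

select
    order_id,
    customer_id,
    order_date,
    amount
from {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- on incremental runs, only process rows newer than what is already loaded
  where order_date > (select max(order_date) from {{ this }})
{% endif %}
```

The `is_incremental()` guard is what turns a full-refresh model into an incremental one: on scheduled runs dbt merges only the new slice on `unique_key`, which is the pattern typically used when replacing full-reload Informatica workflows.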
Requirements:
- 8+ years of experience in Data Engineering, ETL, and Data Warehousing.
- 3+ years of hands‑on dbt experience (Core or Cloud) building production‑grade pipelines.
- Proven experience leading an Informatica-to-dbt migration on Snowflake (Azure preferred).
- Strong Snowflake expertise including schema design, performance tuning, and ELT development.
- Solid working knowledge of the Azure data stack, including ADF, ADLS, and Azure DevOps/GitHub.
- Strong understanding of data modeling standards for staging, integration, and data marts.
- Ability to analyze, optimize, and modernize complex ETL logic.
- Excellent communication and stakeholder‑collaboration skills.
- Experience with Databricks or Azure Synapse pipelines.
- Knowledge of data quality frameworks and testing automation.
- Experience designing reusable frameworks, accelerators, and best‑practice guidelines.
- Exposure to large‑scale cloud migrations or enterprise data modernization initiatives.
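The output-parity validation mentioned under the responsibilities (row counts and aggregates between a legacy Informatica target and its dbt replacement) is often expressed as a dbt singular test, which passes when its query returns zero rows. The table and column names below are hypothetical:

```sql
-- tests/assert_orders_parity.sql (hypothetical table/column names; a sketch of the parallel-run check)
-- a dbt singular test passes only when this query returns no rows
with legacy as (
    select count(*) as row_count, sum(amount) as total_amount
    from legacy_db.public.orders_target        -- output of the legacy Informatica workflow
),
migrated as (
    select count(*) as row_count, sum(amount) as total_amount
    from {{ ref('fct_orders') }}               -- output of the replacement dbt model
)
select *
from legacy
cross join migrated
where legacy.row_count    <> migrated.row_count
   or legacy.total_amount <> migrated.total_amount
```

Running such tests during parallel runs gives an auditable pass/fail record for each converted workflow before the Informatica original is decommissioned.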