Lead Data Engineer - Qlik/Snowflake/DBT

Job in Scottsdale, Maricopa County, Arizona, 85261, USA
Listing for: First Citizens Bank
Full Time position
Listed on 2025-12-28
Job specializations:
  • IT/Tech
    Data Engineer, Data Security
Job Description & How to Apply Below
Position: Lead Data Engineer - Qlik/Snowflake/DBT

Overview

This is a remote role that may only be hired in the following locations: NC, TX, AZ. You will design, build, and operate secure, audited, and cost-efficient data pipelines on Snowflake, from raw ingestion to Data Vault 2.0 models and onward to business-friendly consumption layers (mart/semantic). You'll use Qlik/Glue ETL tooling for ingestion, dbt Cloud for modeling/testing, MWAA/Airflow and/or dbt Cloud's orchestration for scheduling, and Terraform (with HashiCorp practices) for infrastructure-as-code.

The ideal candidate must have hands-on experience with data ingestion frameworks and Snowflake platform database/schema design, security, networking, and governance that satisfy regulatory and compliance audit requirements.

Responsibilities

Modeling & Warehousing

  • Design and implement scalable data ingestion frameworks
  • Implement raw Data Vault 2.0 (Hubs/Links/Satellites) models and consumption patterns in dbt Cloud with robust tests (unique/not_null/relationships/freshness).

  • Build performant Snowflake objects (tables, streams, tasks, materialized views) and optimize clustering/micro-partitioning.
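The Data Vault 2.0 bullets above hinge on deterministic hash keys for Hubs and Links. A minimal Python sketch of that pattern (the function name and the choice of MD5 are illustrative assumptions, not something this posting prescribes):

```python
import hashlib

def hub_hash_key(*business_keys: str) -> str:
    """Illustrative Data Vault 2.0 hash-key pattern: trim and upper-case
    each business key, join with a delimiter, then hash (MD5 here)."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Same business key in different source formatting yields the same hash key,
# which is what lets Hubs and Links join deterministically across loads.
a = hub_hash_key(" cust-001 ")
b = hub_hash_key("CUST-001")
```

In dbt these keys are typically generated in SQL via a macro; the Python version above just makes the normalization-then-hash idea concrete.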

Orchestration

  • Author and operate Airflow (MWAA) DAGs and/or dbt Cloud jobs; design idempotent, rerunnable, lineage-tracked workflows with SLAs/SLOs.
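One common reading of "idempotent, rerunnable" is that each run fully replaces the partition it owns, so an Airflow rerun or backfill converges to the same state rather than appending duplicates. A hypothetical sketch, where an in-memory dict stands in for a warehouse table:

```python
def idempotent_load(table: dict, partition_key: str, rows: list) -> dict:
    """Delete-then-insert for a single partition: reprocessing a run date
    replaces that partition's rows instead of blindly appending them."""
    table = dict(table)                 # copy; avoid mutating shared state
    table.pop(partition_key, None)      # drop any prior load for this run
    table[partition_key] = list(rows)   # insert the fresh rows
    return table

state = idempotent_load({}, "2025-12-28", [{"id": 1}])
state = idempotent_load(state, "2025-12-28", [{"id": 1}])  # rerun: same state
```

In a real DAG the same pattern usually appears as a `DELETE ... WHERE run_date = :ds` followed by an insert, or as a `MERGE`, keyed on the Airflow logical date.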

Security & Governance

  • Enforce RBAC/ABAC, network policies/rules, masking/row access policies, tags, data classification, and least-privilege role hierarchies.
  • Operationalize audit-ready controls (change management, approvals, runbooks, separation of duties, evidence capture).

IaC & DevOps

  • Use CI/CD flows, Terraform, Git branching for code promotion.

Data Quality & Observability

  • Bake tests into dbt; implement contract checks, reconciliations, and anomaly alerts.
  • Monitor with Snowflake event tables and forward logs/metrics to SIEM/APM (e.g., Splunk, Datadog).
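A reconciliation check of the kind the first bullet describes can be as simple as comparing source and target row counts against a drift tolerance. A hedged sketch (the function name and the zero-tolerance default are assumptions for illustration):

```python
def reconcile(source_count: int, target_count: int, tolerance: float = 0.0) -> bool:
    """Return True when the target row count is within `tolerance`
    (as a fraction of the source) of the source row count.
    A False result is the signal to raise an anomaly alert."""
    if source_count == 0:
        return target_count == 0
    drift = abs(source_count - target_count) / source_count
    return drift <= tolerance

ok = reconcile(1000, 995, tolerance=0.01)   # 0.5% drift, within tolerance
bad = reconcile(100, 90)                    # 10% drift, alert
```

Production versions typically reconcile checksums or keyed aggregates rather than bare counts, but the tolerance-threshold shape is the same.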

Cost & Performance

  • Right-size warehouses, configure auto-suspend/auto-resume, multi-cluster for concurrency, resource monitors, and query optimization.

Compliance

  • Build controls and evidence to satisfy internal audit, SOX/GLBA/FFIEC/PCI-like expectations.

Qualifications

Bachelor's Degree and 6 years of experience in advanced data engineering, enterprise architecture, or project leadership; OR High School Diploma or GED and 10 years of experience in advanced data engineering, enterprise architecture, or project leadership.

Preferred:

Snowflake Platform (hands-on, production):

  • Secure account setup: databases/schemas/stages, RBAC/ABAC role design, grants, network policies/rules, storage integrations.
  • Data protection:
    Dynamic Data Masking, Row Access Policies, Tag-based masking, PII classification/lineage tagging.
  • Workloads & features:
    Streams/Tasks, Snowpipe, external tables, file formats, copy options, retries & dedupe patterns.
  • Operations: warehouse sizing, multi-cluster, resource monitors, Time Travel & Fail-safe, cross-region/account replication.
  • Networking concepts: AWS PrivateLink/S3 access patterns, external stages, (at least) high-level familiarity with VPC/DNS/endpoint flows.
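The "retries & dedupe patterns" item above usually means keep-the-latest-record-per-key after ingestion, since Snowpipe/COPY sources can re-deliver files. A hedged Python sketch of that pattern (the `id`/`loaded_at` field names are assumptions):

```python
def dedupe_latest(records: list, key: str = "id", version: str = "loaded_at") -> list:
    """Keep only the most recent record per business key, so replayed or
    re-delivered files do not produce duplicate rows downstream."""
    latest = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[version] > latest[k][version]:
            latest[k] = rec
    return sorted(latest.values(), key=lambda r: r[key])

rows = [
    {"id": 1, "loaded_at": "2025-01-01"},
    {"id": 1, "loaded_at": "2025-01-02"},  # re-delivered, newer version wins
    {"id": 2, "loaded_at": "2025-01-01"},
]
deduped = dedupe_latest(rows)
```

In Snowflake itself the equivalent is typically a `QUALIFY ROW_NUMBER() OVER (PARTITION BY key ORDER BY loaded_at DESC) = 1` on the landing table; the Python version just shows the logic.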

dbt Cloud:

  • Dimensional + Data Vault 2.0 modeling in dbt (H/L/S), snapshots, seeds, exposures, Jinja/macros, packages, artifacts.
  • Testing and documentation discipline; deployment environments (DEV/QA/UAT/PROD) and job orchestration.

Orchestration:

  • Airflow (MWAA):
    Operators/Sensors (dbt, Snowflake, S3), XComs, SLAs, retries, backfills, alerting, and modular DAG design.
  • Experience deciding when to run in dbt Cloud orchestration vs Airflow, and integrating both cleanly.

Data Quality & Observability:

  • Contract tests, reconciliations, freshness SLAs, anomaly detection; surfacing lineage and test results to stakeholders.
  • Query tuning (profiling, pruning, statistics awareness, result caching).

Audit & Controls:

  • Change control with approvals/evidence, break-glass procedures, production access separation, audit log retention/immutability.
  • Runbooks, PIR/RCAs, control mapping (e.g., to SOX/GLBA/PCI-like controls where relevant).

Programming & Cloud:

  • Python (ETL utils, Airflow tasks), SQL (advanced), and AWS basics (S3, IAM, CloudWatch, MWAA fundamentals).

Bonus Skills
  • Snowflake governance: data classification at scale, Universal Search, tags + masking automation.
  • Iceberg/external table…