Manager, Data Quality and Operations
Listed on 2026-01-01
IT/Tech
Data Engineer
Job Overview
Domino’s Pizza began in 1960 as a single store in Ypsilanti, MI. Over time the brand has become a technology leader, consistently ranking among the top five companies in online transactions; 65% of U.S. sales are captured through digital channels. Join our growth‑minded team to help deliver the dream to local business owners and customers worldwide.
Job Description
As a Manager – Data Quality and Operations focused on enterprise data solutions, your primary responsibility is to ensure the delivery of high‑quality, reliable, and efficient data pipelines and operations across the organization. This senior technical leadership role is accountable for end‑to‑end data quality engineering and operational excellence of cloud‑based data solutions.
- Location: Domino’s World Resource Center; 30 Frank Lloyd Wright Dr, Ann Arbor, MI 48105
- Shift: Full-time
- Salary: $140,000–$155,000, plus bonus
- Role: Hybrid (4 days onsite at Domino’s headquarters in Ann Arbor; Fridays remote)
Lead Data Quality Engineering & Data QA
- Build quality in: Engineer data pipelines with quality‑first principles aligned to business and technical requirements.
- Design quality controls: Define, implement, and maintain automated QA checks for critical data assets, with thresholds and SLA‑aligned alerting and escalation (a sketch follows this list).
- Enterprise data quality framework: Establish best practices and measurable DQM standards (profiling, validity, completeness, timeliness, consistency, accuracy) across domains.
- Test automation at scale: Drive in‑sprint and regression automation for batch and streaming workloads; integrate tests into CI/CD to prevent regressions and accelerate release cycles.
- Coach and develop talent: Lead a pod of QA/Data Quality specialists and raise the technical bar in SQL/Python, test design, and root‑cause analysis.
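By way of illustration, an automated quality control of the kind described above might start as small as the following Python sketch: threshold‑based completeness and validity checks that escalate when a score falls below its SLA‑aligned floor. The column names, thresholds, and `send_alert` stub are hypothetical; a production check would run against warehouse tables and page a real on‑call channel.

```python
import pandas as pd

# Illustrative thresholds an SLA-aligned check might enforce.
COMPLETENESS_MIN = 0.99   # at most 1% nulls allowed in critical columns
VALIDITY_MIN = 0.98       # minimum share of rows passing domain rules

def send_alert(check: str, score: float, threshold: float) -> None:
    # Stand-in for a real escalation hook (pager, Slack, email).
    print(f"ALERT: {check} scored {score:.4f}, below threshold {threshold}")

def run_quality_checks(df: pd.DataFrame) -> bool:
    ok = True

    # Completeness: share of non-null values in a critical column.
    completeness = df["order_id"].notna().mean()
    if completeness < COMPLETENESS_MIN:
        send_alert("completeness:order_id", completeness, COMPLETENESS_MIN)
        ok = False

    # Validity: order totals must be positive to count as valid.
    validity = (df["order_total"] > 0).mean()
    if validity < VALIDITY_MIN:
        send_alert("validity:order_total", validity, VALIDITY_MIN)
        ok = False

    return ok

if __name__ == "__main__":
    sample = pd.DataFrame({
        "order_id": [1, 2, None, 4],
        "order_total": [19.99, -5.00, 12.50, 8.25],
    })
    passed = run_quality_checks(sample)
    print("checks passed" if passed else "checks failed")
```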
Run Data Operations
- Own production SLAs: Monitor and support an extensive footprint of pipelines, ensuring uptime and on‑time delivery for key datasets, metrics, and downstream products (see the sketch after this list).
- Triage & remediate fast: Lead incident response for data quality/availability issues; drive RCA and corrective actions; reduce MTTR through automation and playbooks.
- Analyze & prevent: Apply EDA to quantify impact (blast radius), identify failure patterns, and implement preventive controls and observability.
- Harden the pipeline factory: Mature CI/CD (branching, approvals, quality gates) and release automation; improve MFT and orchestration flows for reliability and throughput.
- Build the team: Recruit, onboard, and mentor Data Operations Analysts to support enterprise data modernization initiatives at scale.
- Participate in an on‑call rotation for critical data products and platform components.
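As a rough illustration of the SLA‑monitoring responsibility, the sketch below compares each pipeline's last successful run against its delivery window and flags breaches for triage. The pipeline names, windows, and escalation print are invented for this example; a real implementation would read run metadata from the orchestrator.

```python
from datetime import datetime, timedelta

# Hypothetical SLA registry: pipeline -> allowed delivery window.
SLAS = {
    "daily_sales_load": timedelta(hours=24),
    "store_inventory_sync": timedelta(hours=6),
}

def find_sla_breaches(last_success: dict[str, datetime],
                      now: datetime) -> list[str]:
    """Return pipelines whose last successful run exceeds the SLA window."""
    breaches = []
    for pipeline, window in SLAS.items():
        ran_at = last_success.get(pipeline)
        if ran_at is None or now - ran_at > window:
            breaches.append(pipeline)
    return breaches

if __name__ == "__main__":
    now = datetime(2026, 1, 1, 12, 0)
    last_success = {
        "daily_sales_load": now - timedelta(hours=30),    # late: breach
        "store_inventory_sync": now - timedelta(hours=2), # on time
    }
    for pipeline in find_sla_breaches(last_success, now):
        # Stand-in for paging/escalation per the on-call rotation.
        print(f"SLA BREACH: {pipeline} -> open incident, start RCA")
```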
Must‑have skills & experience
- Hands‑on technical leadership in data engineering, QA/quality engineering, and data operations.
- Deep proficiency in SQL, ETL tools, and Python for test automation, data validation, and triage.
- Strong experience with ETL/ELT and orchestration (e.g., Control‑M, Airflow, Databricks Jobs).
- CI/CD pipelines for data (Git/GitHub, Jenkins/GitHub Actions), including quality gates and automated regressions (see the sketch after this list).
- Familiarity with MFT platforms and secure file transfer patterns.
- Proven track record building DQ rulesets (profiling, constraints, anomaly detection) and putting them into production with monitoring and alerting.
- Production support experience: incident management, RCA, and post‑mortems with action tracking and verification.
- Strong problem‑solving skills; ability to translate requirements into executable tests and controls.
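A quality gate of the kind listed above could be expressed as plain pytest checks run in the CI stage; any assertion failure exits non‑zero and blocks the release. The loader stub, table shape, and row‑count floor here are assumptions for illustration only.

```python
import pandas as pd

def load_published_orders() -> pd.DataFrame:
    # Stand-in for reading the candidate dataset from a staging area;
    # a real gate would query the warehouse or read the release artifact.
    return pd.DataFrame({
        "order_id": [1, 2, 3],
        "store_id": [101, 102, 101],
        "order_total": [19.99, 12.50, 8.25],
    })

def test_no_row_count_regression():
    # Regression gate: candidate data must not shrink below a floor
    # derived from the previous release (hard-coded here for illustration).
    previous_release_rows = 3
    df = load_published_orders()
    assert len(df) >= previous_release_rows

def test_primary_key_unique():
    # Constraint gate: order_id must be a valid primary key.
    df = load_published_orders()
    assert df["order_id"].is_unique

def test_totals_positive():
    # Domain-rule gate: totals must be positive across all rows.
    df = load_published_orders()
    assert (df["order_total"] > 0).all()
```

Run with `pytest` from a Jenkins or GitHub Actions job so that a failing check stops promotion of the release.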
Nice to have
- Experience with cloud data platforms (Azure/AWS/GCP) and cloud data warehouses/lakehouses; Databricks strongly preferred.
- Familiarity with data warehousing, dimensional modeling, and performance tuning.
- Exposure to Customer 360/MDM and enterprise data governance.
- Experience with BI/semantic layers and data product SLAs.
- Background working with streaming (Kafka, Event Hubs) and schema management.
Leadership & behaviors
- Builder’s mindset with bias for automation and measurable outcomes.
- Clear, candid communicator; able to translate between engineering detail and…