MDM and Data Quality Engineer
Listed on 2025-12-21
IT/Tech
Data Engineer, Data Analyst, Data Warehousing, Data Security
JOB DESCRIPTION
Sinclair, Inc. is seeking a Data Quality and Master Data Management Engineer to join our Enterprise Data and Insights Team (ED&I). This role is responsible for designing, developing, and maintaining Master Data Management (MDM) and Data Quality (DQ) solutions that ensure data accuracy, consistency, and governance across Sinclair's enterprise systems.
This role takes ownership of the custom-built Data Quality Foundation, leveraging Snowflake, Cognos Analytics, SSRS, and ED&I engineering patterns to expand and mature the enterprise's ability to measure, monitor, remediate, and improve data quality at scale.
The engineer will design, implement, and optimize data quality rules, issue detection logic, exception reporting, and the underlying data pipelines that feed our enterprise Data Quality Dashboard. They will partner closely with the Enterprise Data Governance (EDG) Office, acting as the technical execution arm that turns governance policies, standards, and domain-specific rules into automated data quality controls embedded in Sinclair's modern data ecosystem.
This is a hands‑on engineering role requiring strong data modeling, SQL development, metadata understanding, and business domain translation skills, and is ideal for someone who enjoys building robust data assets that directly improve trust, decision‑making, and operational efficiency across the enterprise.
The ideal candidate has hands‑on experience with modern MDM and Data Quality platforms, proficiency in Snowflake, SQL, and Python, and a strong understanding of data modeling, data governance, and automation in large‑scale cloud environments. This position will work closely with data engineers, architects, and business stakeholders to implement scalable, automated, and intelligent data management solutions.
Key Responsibilities:
Data Quality Engineering & Rule Development
- Own, maintain, and expand the ED&I‑built Data Quality Foundation, including Snowflake objects, SQL logic, metadata layers, and reporting structures.
- Translate data governance policies, quality standards, and field‑level requirements into technical data quality rules (completeness, uniqueness, conformity, referential integrity, validity, timeliness, etc.).
- Build, maintain, and support scalable pipelines that detect, measure, and store data quality results across multiple domains (starting with Oracle B2B Customer, expanding to Vendor, Employee, Product, and more).
- Implement automated frameworks that surface issues in real‑time or near real‑time.
Master Data Management (MDM) & Golden Record Mapping
- Analyze and map source system attributes (Oracle Fusion, CDM, legacy systems, CRM, ERP, broadcast systems, etc.) to the Golden Record structure managed by ED&I/EDG.
- Create and manage transformation logic to standardize, harmonize, and prepare data for MDM use cases.
- Collaborate with business stewards and EDG domain leaders to refine golden record attributes, survivorship rules, and lineage.
Dashboard & Reporting Ownership
- Maintain and enhance the Data Quality Dashboard delivered through Cognos Analytics and SSRS, including schema changes, measure expansion, usability improvements, and new rule integrations.
- Ensure dashboards reflect clear, executive‑ready views of data quality trends, issue volume, domain health, and field‑level KPIs.
- Manage end‑to‑end exception reporting, including daily/weekly files sent to operational teams for triage and remediation.
Collaboration & Governance Enablement
- Partner with the Enterprise Data Governance Office to understand domain priorities, evolving standards, and new data policies.
- Translate governance requirements into actionable technical specifications and automated controls.
- Act as a trusted technical advisor to Data Stewards, Data Owners, and Data Council representatives.
- Participate in stewardship working groups and provide technical insight on data quality feasibility, root cause analysis, and upstream/downstream impacts.
Technical Operations & Continuous Improvement
- Monitor performance of Snowflake workloads, optimize SQL logic, and ensure cost‑effective data quality processing.
- Maintain documentation of rules, mappings, logic, lineage, and…