
Senior Data Quality Analyst

Job in Wilmington, New Castle County, Delaware, 19894, USA
Listing for: Bayforce
Full Time position
Listed on 2026-02-10
Job specializations:
  • IT/Tech
    Data Security, Data Engineer
Job Description

Duration

The contract initially runs through year-end 2026, with a possible extension to December 2027.

Preferred Location

Four days per week onsite in Wilmington, DE, or Buffalo, NY.

Role Description

The Senior Data Quality Analyst is a hands-on data quality engineer responsible for building automation-first controls, monitoring, and remediation. The role designs, scripts, and operationalizes data quality validation pipelines, integrates with observability platforms, and leverages APIs to orchestrate end-to-end workflows across cloud and on-prem environments.

  • Engineer automated data quality pipelines that detect, diagnose, and remediate data defects; design for scalability, idempotency, and observability.
  • Use Python/SQL for data profiling, rule evaluation, schema validation, anomaly detection, and automated exception workflows.
  • Implement API-driven integrations (REST/GraphQL/SDKs) with DQ, catalog, ticketing, notification, and orchestration systems; handle OAuth2, pagination, rate limits, retries, and backoff.
  • Define and maintain DQ rule libraries and validation frameworks as reusable, versioned assets; enforce code quality via unit tests, linting, and static analysis.
  • Embed automated controls into ETL/ELT and data integration jobs; implement pre-/post-load validations, row- and column-level checks, and SLA/SLO monitoring.
  • Build event-driven alerts and webhook workflows to route exceptions to the right queues (e.g., Jira/ServiceNow), with metadata for reproducibility and audit.
  • Develop KPI dashboards and operations apps using Power BI and Power Apps to visualize coverage, drift, and rule performance, and to streamline triage/remediation.
  • Conduct root cause analysis using lineage, logs, and metrics; implement automated remediation where feasible (e.g., rollback, quarantine, replay).
  • Contribute to CI/CD workflows (GitLab or equivalent) for DQ assets: automated tests, environment promotion, secrets management, and change controls.
  • Author and maintain runbooks, design docs, and operational playbooks; mentor analysts on scripting, APIs, and engineering best practices.
  • Uphold risk, regulatory, and internal control requirements; ensure auditability and traceability throughout DQ processes.
  • Perform other duties as assigned.
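By way of illustration only, the API-integration duty above (pagination, retries, backoff) might look like the following minimal Python sketch. All names are hypothetical: `get_page` stands in for an authenticated REST call (e.g., a GraphQL or SDK client with an OAuth2 bearer token), and the response shape is assumed, not taken from any specific platform.

```python
import time
from typing import Callable

def fetch_all_pages(
    get_page: Callable[[int], dict],
    max_retries: int = 3,
    base_delay: float = 0.5,
) -> list:
    """Collect records across pages, retrying each page with
    exponential backoff on transient failures.

    `get_page(page)` is a hypothetical callable returning a dict like
    {"items": [...], "next": bool}; in practice it would wrap an
    authenticated HTTP request and honor the API's rate-limit headers.
    """
    records, page = [], 1
    while True:
        for attempt in range(max_retries):
            try:
                payload = get_page(page)
                break
            except ConnectionError:
                if attempt == max_retries - 1:
                    raise  # exhausted retries: surface to the exception workflow
                # Exponential backoff: base_delay, 2x, 4x, ...
                time.sleep(base_delay * (2 ** attempt))
        records.extend(payload["items"])
        if not payload.get("next"):
            return records
        page += 1
```

Injecting the fetch function keeps the retry/pagination logic unit-testable without a live endpoint, which matches the posting's emphasis on tested, reusable DQ assets.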
Requirements
  • Bachelor's degree and a minimum of 5 years related experience; or in lieu of a degree, a combined minimum of 9 years higher education and/or work experience, including a minimum of 5 years related experience.
  • Advanced proficiency in Python and SQL for automation, including building reusable libraries, scheduled jobs, and validation pipelines.
  • API engineering experience: designing and consuming REST/GraphQL APIs, handling auth (OAuth2/Bearer), pagination, rate limits, error handling/retries, and webhooks for event-driven processes.
  • Experience building solutions with Power BI and Power Apps for operational reporting and workflow automation.
  • Demonstrated success collaborating across engineering, governance, and business teams in fast-paced environments.
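To give a flavor of the Python/SQL validation work the requirements describe, here is a minimal sketch of a reusable completeness check (one of the standard DQ dimensions). The table, column, and in-memory SQLite database are illustrative stand-ins for a real warehouse; a production rule would come from a versioned rule library rather than string formatting.

```python
import sqlite3

def completeness(conn, table: str, column: str) -> float:
    """Fraction of non-null values in `column` of `table`.

    Identifiers are illustrative; COUNT(column) ignores NULLs,
    so dividing by COUNT(*) yields the completeness ratio.
    """
    row = conn.execute(
        f"SELECT COUNT({column}) * 1.0 / COUNT(*) FROM {table}"
    ).fetchone()
    return row[0]

# Usage with an in-memory stand-in for the warehouse
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "a@x.com"), (2, None), (3, "c@x.com"), (4, "d@x.com")],
)
score = completeness(conn, "customers", "email")  # 3 of 4 rows non-null
```

A rule like this would typically be evaluated on a schedule, compared against an SLA threshold, and routed to an exception queue on failure.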
Preferred
  • Experience with automated ETL/ELT validation and data integration platforms; familiarity with pre-/post-load checks, data contracts, and schema enforcement.
  • Proficiency with DQ/observability platforms (e.g., Informatica Cloud DQ, Monte Carlo, Anomalo, Collibra OwlDQ) and their SDKs/APIs.
  • Hands-on data profiling, rule design, and automated measurement of DQ dimensions (completeness, validity, accuracy, timeliness, uniqueness, consistency).
  • Experience implementing governance-aligned DQ frameworks (standards, metadata contracts, lineage, policy-as-code).
  • Experience with workflow/automation tools (Power Apps, Alteryx, or equivalent) and messaging/queueing for event-driven triage.
  • Cloud experience (Azure, Snowflake): scripting with providers/SDKs, secrets management, job orchestration, and cost-aware design.
  • CI/CD in GitLab (or similar): pipeline design, automated testing, code scanning, environment promotion, and approvals.
  • Exposure to AI/ML-based anomaly detection or statistical monitoring; ability to integrate model outputs via APIs.
  • Excellent communication skills: able to explain engineering decisions and automation tradeoffs to non-technical stakeholders.
  • Proven ability to manage multiple concurrent automation initiatives and deliver high-quality solutions on schedule.
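For the statistical-monitoring item above, a simple z-score check is a common baseline before reaching for ML-based tooling. The sketch below flags days whose row counts deviate sharply from the mean; the data and threshold are illustrative, not from any specific system.

```python
from statistics import mean, stdev

def zscore_anomalies(values: list[float], threshold: float = 2.0) -> list[int]:
    """Return indices whose value deviates from the mean by more than
    `threshold` sample standard deviations -- a minimal stand-in for
    statistical drift monitoring on a pipeline metric.
    """
    mu, sigma = mean(values), stdev(values)
    return [
        i for i, v in enumerate(values)
        if sigma > 0 and abs(v - mu) / sigma > threshold
    ]

# Hypothetical daily row counts; the final day looks like a load defect
daily_rows = [1000, 1010, 990, 1005, 995, 4000]
anomalies = zscore_anomalies(daily_rows)  # -> [5]
```

Note that with a sample standard deviation and few data points, a single outlier cannot exceed a z-score of about `(n-1)/sqrt(n)`, so the threshold here is deliberately modest; robust alternatives (median/MAD) are often preferred in practice.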