Data Quality Rules Analyst
Listed on 2026-02-16
IT/Tech
Data Analyst, Data Security
Accelerant is a data-driven risk exchange connecting underwriters of specialty insurance risk with risk capital providers. Accelerant was founded in 2018 by a group of longtime insurance industry executives and technology experts who shared a vision of rebuilding the way risk is exchanged – so that it works better, for everyone. The Accelerant risk exchange does business across more than 20 different countries and 250 specialty products, and we are proud that our insurers have been awarded an AM Best A- (Excellent) rating.
Our objective is to reshape the value chain for our Members and Insurers using data driven insights. Our success will be based on the value data creates for our Members, risk capital providers, other suppliers and ourselves.
This role is part of a newly created and fast-growing division. The Data Office combines three teams: Data Products, Data Quality and Data Management. Today, the Data Quality team consists of the Data Quality Lead and a Data Quality Support Analyst.
The Data Quality Rules Analyst is responsible for front-line ownership of the validation lifecycle, translating Data Quality intent and standards into clear, consistent, and scalable deterministic controls across ingestion, data products, and systems.
The role focuses on rule specification, catalogue governance, control behaviour, and tuning, ensuring that large volumes of data quality controls remain well-structured, interpretable, and maintainable as the platform scales.
Key Responsibilities
Validation Intake & Front-Line Support
- Act as the first point of contact for new validation requests.
- Handle validation support requests (e.g. overrides, unexpected behaviour, suspected defects).
- Clarify intent, scope, expected behaviour and enforcement with Data Owners and Product.
- Determine whether requests can be handled within existing standards or require escalation.
- Escalate to the Data Quality Lead only where intent, appetite or priority is unclear or challenged.
Validation Rule Specification
- Translate validation requirements into clear, build-ready rule specifications, covering: rule intent and business description; severity and enforcement level; thresholds and tolerances; expected behaviour and override guidance; ingestion validations; data product and cross-system controls; and system- and application-level checks.
- Ensure rules are unambiguous, testable, and monitorable, resolving specification-level ambiguity independently where possible.
- Own and maintain the validation catalogue content as a governed artefact, including rule metadata and descriptions, ownership mapping, lifecycle status (proposed, active, noisy, deprecated), and linkage to datasets and Atlan metadata.
- Ensure catalogue hygiene and consistency as rule volumes scale, including identifying duplicate or overlapping rules and recommending retirement of obsolete or ineffective controls.
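To make the shape of this work concrete, here is a minimal sketch of how a catalogued rule specification might be represented in code. All names, fields, and values are illustrative assumptions, not Accelerant's actual schema or tooling:

```python
from dataclasses import dataclass, field
from enum import Enum

class Severity(Enum):
    INFO = "info"
    WARNING = "warning"
    BLOCKING = "blocking"

class LifecycleStatus(Enum):
    # Lifecycle states named in the catalogue responsibilities above
    PROPOSED = "proposed"
    ACTIVE = "active"
    NOISY = "noisy"
    DEPRECATED = "deprecated"

@dataclass
class RuleSpec:
    rule_id: str
    intent: str                   # business description of what the rule protects
    severity: Severity            # enforcement level
    threshold: float              # tolerated failure rate before the control fires
    status: LifecycleStatus = LifecycleStatus.PROPOSED
    datasets: list = field(default_factory=list)  # linkage to datasets / metadata

# Illustrative catalogue entry
rule = RuleSpec(
    rule_id="DQ-001",
    intent="Policy premium must be non-negative",
    severity=Severity.BLOCKING,
    threshold=0.0,
    datasets=["policies"],
)
print(rule.status.value)  # proposed
```

Structuring rules this way keeps them unambiguous, testable, and easy to deduplicate as catalogue volumes grow.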
Control Monitoring, Interpretation & Tuning
- Interpret control behaviour and form hypotheses about root cause.
- Propose bounded refinements to thresholds or logic where evidence is clear.
- Escalate to the Data Quality Lead where changes affect agreed Data Quality appetite or enforcement posture.
Anomaly Detection & RCA Support
- Perform first-pass interpretation of anomalies and outliers.
- Analyse data and control behaviour to support Root Cause Analysis (RCA).
- Provide evidence-based recommendations to inform DQ Lead decisions.
- Support Data Owners and Product during RCA with technical and analytical insight.
Requirements
- 3–6 years’ experience in data quality, data governance, analytics engineering, or data operations roles.
- Experience working with large, complex datasets with high volumes of data quality validations.
- Strong analytical mindset, able to interpret data behaviour, trends, distributions, outliers and cross-dataset consistency.
- Proven experience interpreting data quality issues at scale and forming evidence-based recommendations on rule refinement, thresholds, and upstream process improvements.
- Experience defining and maintaining deterministic data quality controls, including clear rule intent, severity, thresholds and expected behaviour.
- Validation catalogue experience, including maintaining rule metadata, ownership and lifecycle status.
- Working knowledge of SQL sufficient to interrogate datasets independently, explore data distributions, validate assumptions behind proposed controls, and investigate unexpected validation behaviour.
- Experience supporting Root Cause Analysis (RCA) through data analysis and evidence gathering.
- Ability to operate effectively in a maturing data quality environment, applying defined standards and guardrails while tooling and processes continue to evolve.
- Strong collaboration skills, working closely with Product, Technology and Data teams to ensure controls are implemented as specified.
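As an illustration of the SQL interrogation skills listed above, the sketch below checks a column's null rate against a proposed tolerance before a control is formalised. The table, data, and tolerance are hypothetical, and SQLite stands in for whatever warehouse the team actually uses:

```python
import sqlite3

# Build a small in-memory dataset (illustrative only)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (policy_id TEXT, premium REAL)")
conn.executemany(
    "INSERT INTO policies VALUES (?, ?)",
    [("P1", 100.0), ("P2", None), ("P3", 250.0), ("P4", 80.0)],
)

# Null rate for the premium column: average of a 1/0 indicator
(null_rate,) = conn.execute(
    "SELECT AVG(CASE WHEN premium IS NULL THEN 1.0 ELSE 0.0 END) FROM policies"
).fetchone()

TOLERANCE = 0.05  # hypothetical appetite: at most 5% missing premiums
print(f"null rate = {null_rate:.2f}, breach = {null_rate > TOLERANCE}")
# null rate = 0.25, breach = True
```

The same pattern, measure first, then compare against an agreed tolerance, supports the evidence-based threshold tuning and RCA work described in the responsibilities.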