Data Quality Engineer
Location: South Tangerang, Banten, Indonesia
Company: SIRCLO
Employment type: Full Time
Listed on: 2026-02-05
Job specializations:
- IT/Tech: Data Engineer, Data Analyst, Data Warehousing, Data Security
Job Description & How to Apply Below
Responsibilities
- Define and execute data quality criteria and test strategies for batch and real-time (streaming) data pipelines.
- Build and maintain automated data quality checks across all data layers (medallion architecture), including schema validation, completeness, uniqueness, timeliness, referential integrity, and consistency (see the sketch after this list).
- Proactively monitor and analyze data quality metrics, investigate anomalies, inconsistencies, and defects, and coordinate timely resolution.
- Collaborate within the data engineering and data intelligence squads to implement QA processes in CI/CD for analytics code (e.g., dbt runs / tests).
- Advance data observability practices by developing alerting, dashboards, and SLAs / SLOs for data freshness and correctness.
- Document and validate data lineage (from source to transformation to reporting), ensuring traceability of key fields and metrics.
- Contribute to defining data contracts and acceptance criteria for datasets and reports.
- Support data governance activities (e.g., documenting standards, data contracts, change management protocols).
- Maintain and enhance data quality documentation, including data dictionaries and validation logic.
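The automated quality checks described in the responsibilities above (completeness, uniqueness, referential integrity, timeliness) could be expressed roughly as in the following minimal pandas sketch; the table and column names (orders, customers, order_id, customer_id, updated_at) and the 24-hour freshness SLA are hypothetical, not taken from this listing.

```python
# Minimal sketch of automated data quality checks; assumes pandas DataFrames
# loaded from hypothetical "orders" and "customers" tables.
import pandas as pd

def check_orders_quality(orders: pd.DataFrame, customers: pd.DataFrame) -> dict:
    """Return a mapping of check name -> pass/fail for a hypothetical orders dataset."""
    results = {}

    # Completeness: the primary key must never be null.
    results["order_id_not_null"] = orders["order_id"].notna().all()

    # Uniqueness: the primary key must not repeat.
    results["order_id_unique"] = orders["order_id"].is_unique

    # Referential integrity: every order must point to an existing customer.
    results["customer_fk_valid"] = (
        orders["customer_id"].isin(customers["customer_id"]).all()
    )

    # Timeliness: the newest record must be fresher than a (hypothetical) 24-hour SLA.
    latest = pd.to_datetime(orders["updated_at"], utc=True).max()
    results["freshness_within_sla"] = (
        pd.Timestamp.now(tz="UTC") - latest <= pd.Timedelta(hours=24)
    )

    return results
```

In a production pipeline, each failed check would typically raise an alert or block promotion of the dataset to the next layer.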
Requirements
- 3+ years of experience as a data analyst, data engineer, analytics engineer, or similar role.
- Preferred: educational background in a STEM field.
- Strong SQL skills and direct experience or familiarity with BigQuery, MySQL, PostgreSQL, Databricks, and ClickHouse.
- Hands-on experience with dbt (models, tests, docs, exposures, sources, and test strategies).
- Experience validating datasets in a medallion architecture, and understanding of dimensional modeling (e.g., star schema) and semantic layers.
- Experience or familiarity with BI / analytics platforms such as Apache Superset, Looker, Tableau, Redash, Metabase, or Power BI.
- Familiarity with both streaming and batch data pipelines; understanding of CDC (Change Data Capture) concepts.
- Proficient in documenting data lineage and data quality test processes.
- Understanding of data observability frameworks, SLAs/SLOs, and incident/root cause analysis for recurring data issues.
- Ability to communicate and collaborate with data, engineering, and other non-technical teams.
- Familiarity with data quality / observability tooling (e.g., Great Expectations, Soda, Monte Carlo, Datadog).
- Experience with orchestration tools (e.g., Airflow) and CI/CD workflows for data pipelines (see the sketch after this list).
- Exposure to data governance practices (data contract documentation, cataloging, and standards).
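For the dbt and CI/CD requirements above, a pipeline step that runs dbt tests and fails the build on any test failure might look roughly like the sketch below; the selector tag:critical is a hypothetical example, and dbt is assumed to be installed in the CI environment.

```python
# Minimal sketch of a CI/CD step that runs dbt tests and fails the build on errors.
# The selector "tag:critical" is hypothetical; any dbt node selector would work.
import subprocess
import sys

def run_dbt_tests(select: str = "tag:critical") -> int:
    """Run dbt tests for the selected models and return the exit code for CI."""
    result = subprocess.run(
        ["dbt", "test", "--select", select],  # non-zero exit code if any test fails
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
    return result.returncode

if __name__ == "__main__":
    sys.exit(run_dbt_tests())
```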