
Data Integrity Engineer

Job in Chicago, Cook County, Illinois, 60290, USA
Listing for: Belvedere Trading, LLC
Full Time position
Listed on 2026-01-04
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description & How to Apply Below


Belvedere Trading is a leading proprietary trading firm headquartered in downtown Chicago. Our traders provide liquidity to the market through market‑making activities across commodities, interest rates, ETFs, and equity index options. From the beginning, we have iteratively invested in our proprietary technology, building systems from the ground up. Our trading models and software are continually re‑engineered, optimized, and maintained to stay ahead of the industry.

This success is powered by our technology teams, who innovate and perfect our solutions.

The Data Platform team delivers high‑quality data solutions to our core trading and affiliated teams. We support end users who rely on data for analytics and research, as well as those who depend on the database infrastructure backing production trading systems. Our mission is to ensure that well‑curated, trustworthy data is available to the right people at the right time, along with the tools needed to perform complex analytics and make data‑driven decisions.

The Data Integrity Engineer ensures that Belvedere’s data is accurate, consistent, and reliable across systems and over time. This role designs, implements, and maintains data pipelines and validation processes, including checks for completeness, consistency, and quality. The engineer monitors key datasets and dashboards, investigates anomalies, and partners with data engineers, quants, analysts and business stakeholders to resolve root causes and prevent recurrence.

The engineer defines and documents data standards, business rules, and lineage so teams can confidently use data for reporting, analytics, and decision‑making. This hands‑on role uses SQL, Python, and modern data platforms to build trust in Belvedere’s data.
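By way of illustration, a data quality check of the kind this role maintains might look like the following minimal Python sketch. The table, columns, and check names here are hypothetical, and SQLite stands in for a production data warehouse:

```python
import sqlite3

# Hypothetical trades table checked for completeness (no NULL prices)
# and consistency (positive quantities). Schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trades (trade_id INTEGER, symbol TEXT, price REAL, qty INTEGER)"
)
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?, ?)",
    [(1, "SPY", 452.10, 100), (2, "QQQ", None, 50), (3, "IWM", 198.30, -5)],
)

def run_checks(conn):
    """Return a mapping of check name -> number of violating rows."""
    checks = {
        "null_price": "SELECT COUNT(*) FROM trades WHERE price IS NULL",
        "nonpositive_qty": "SELECT COUNT(*) FROM trades WHERE qty <= 0",
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

violations = run_checks(conn)
print(violations)  # {'null_price': 1, 'nonpositive_qty': 1}
```

In practice such checks would run on a schedule (e.g., via an orchestrator like Airflow), with nonzero violation counts surfaced to dashboards or alerts rather than printed.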

What You'll Do
  • Design, build, and operate reliable data pipelines and analytics that power critical metrics and dashboards across the firm.
  • Serve as a central point of contact for data requests from trading and technology teams, helping stakeholders understand what data exists and how to use it effectively.
  • Partner with stakeholders to translate business questions into clear data requirements and set expectations around effort, complexity, and timelines for new pipelines and data sources.
  • Implement and iterate on metrics, reports, and analyses for both Trading and Technology, validating results and resolving discrepancies.
  • Define, implement, and maintain data quality checks (e.g., validation rules, reconciliations, anomaly detection) to ensure accuracy, completeness, and timeliness of key datasets.
  • Monitor data pipelines, dashboards, and quality indicators; proactively diagnose and remediate performance, reliability, and integrity issues.
  • Collaborate with other teams to modernize legacy processes, improve performance, reduce operational risk, and roll out new platform capabilities.
  • Document data definitions, business rules, lineage, and usage patterns so teams can confidently self‑serve and trust the data they consume.
  • Participate in incident response for data‑related issues, drive root‑cause analysis, and implement preventative improvements.
  • Develop a deep understanding of our data sources, trading workflows, and quantitative analyses through hands‑on querying, exploration, and collaboration with quants and engineers.
  • Contribute to establishing and evolving standards, best practices, and tooling for data integrity across the Data Platform team.
  • Occasionally perform duties outside of trading hours, including weekends and holidays, as required for scheduled maintenance or to address unforeseen emergencies.
What You'll Need
  • Professional experience in data integrity engineering, data engineering, analytics engineering, or a similar data‑focused role.
  • Strong SQL skills and a solid understanding of relational and non‑relational data modeling.
  • Experience building, scheduling, and monitoring data pipelines using orchestration or workflow tools (e.g., Airflow, Dataform).
  • Exposure to modern cloud data warehouses and tooling (e.g., BigQuery, Snowflake, Redshift).
  • Proficiency with Python scripting.
  • Experience with data…