
Senior Data Engineer

Job in Los Angeles, Los Angeles County, California, 90079, USA
Listing for: COVU
Full Time position
Listed on 2026-01-02
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range or Industry Benchmark: $200,000 – $250,000 USD yearly
Job Description & How to Apply Below


COVU is a venture‑backed technology startup transforming the insurance industry. We empower independent agencies with AI‑driven insights and digitized operations, enabling them to manage risk more effectively. Our team is building an AI‑first company set to redefine the future of insurance distribution.

Role Overview

We are seeking an experienced and product‑focused Senior Data Engineer to be a core member of our Platform product team. This is a high‑impact role where you will play a pivotal part in evolving our core data infrastructure.

Your primary mission will be to develop key components of our “Policy Journal” – the foundational data asset that will serve as the single source of truth for all policy, commission, and client accounting information. You will work closely with the Lead Data Engineer and business stakeholders to translate requirements into robust data models and scalable pipelines that drive analytics and operational efficiency for our agents, managers, and leadership.

This role requires a blend of greenfield development, strategic refactoring of existing systems, and a deep understanding of how to create trusted, high‑quality data products.

What You’ll Do
  • Develop the Policy Journal:
    Be a primary builder of our master data solution that unifies policy, commission, and accounting data from sources like IVANS and Applied EPIC. You will implement the data models and pipelines that create the “gold record” powering our platform.
  • Ensure Data Quality and Reliability:
    Implement robust data quality checks, monitoring, and alerting to ensure the accuracy and timeliness of all data pipelines. You will champion and contribute to best practices in data governance and engineering.
  • Build the Foundational Analytics Platform:
    Implement and enhance our new analytics framework using modern tooling (e.g., Snowflake, dbt, Airflow). You will build and optimize critical data pipelines, transforming raw data into clean, reliable, and performant dimensional models for business intelligence.
  • Modernize Core ETL Processes:
    Systematically refactor our existing Java & SQL (PostgreSQL) based ETL system. You will identify and resolve core issues (e.g., data duplication, performance bottlenecks), strategically rewriting critical components in Python and migrating orchestration to Airflow (see the sketch after this list).
  • Implement Data Quality Frameworks:
    Working within our company's QA strategy, you will build and execute automated data validation frameworks. You will be responsible for writing tests that ensure the accuracy, completeness, and integrity of our data pipelines and the Policy Journal.
  • Collaborate and Contribute to Design:
    Partner with product managers, the Lead Data Engineer, and business stakeholders to understand complex business requirements. You will be a key technical contributor, translating business needs into well‑designed and maintainable solutions.
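
For illustration only, here is a minimal sketch of the kind of pipeline this role owns: a TaskFlow-style Airflow DAG (Airflow 2.4+ syntax) that loads a hypothetical policy_journal dataset and gates it behind a simple data quality check. All source, table, field, and task names are assumptions made for the example, not COVU's actual schema.

    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["policy-journal"])
    def policy_journal_daily():
        """Extract -> transform -> validate flow for a hypothetical Policy Journal load."""

        @task
        def extract_raw_policies():
            # Stand-in rows; in practice this step would pull IVANS / Applied EPIC
            # extracts from S3 or a vendor API.
            return [
                {"policy_id": "P-1001", "premium": 1200.0, "carrier": "Acme Insurance"},
                {"policy_id": "P-1001", "premium": 1200.0, "carrier": "Acme Insurance"},  # duplicate feed row
                {"policy_id": "P-1002", "premium": 830.0, "carrier": "Acme Insurance"},
            ]

        @task
        def transform_to_journal(rows):
            # Normalize into the "gold record" shape and dedupe on policy_id.
            journal = {}
            for row in rows:
                journal[row["policy_id"]] = {**row, "loaded_at": datetime.utcnow().isoformat()}
            return list(journal.values())

        @task
        def validate_journal(rows):
            # Minimal completeness check; a production pipeline might rely on dbt tests
            # or a dedicated data quality framework instead.
            missing_premium = [r["policy_id"] for r in rows if r.get("premium") is None]
            if missing_premium:
                raise ValueError(f"Premium missing for policies: {missing_premium}")
            return len(rows)

        validate_journal(transform_to_journal(extract_raw_policies()))


    policy_journal_daily()

In a real deployment the transform step would more likely be a dbt model run against Snowflake, with Airflow only orchestrating; the validation task mirrors the kind of automated checks described above.
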
What We're Looking For
  • 5+ years of experience in data engineering, with a proven track record of building and maintaining scalable data pipelines in production.
  • Expert‑level proficiency in Python and SQL.
  • Strong experience with modern data stack technologies, including a cloud data warehouse (Snowflake or Redshift), a workflow orchestrator (Airflow is highly preferred), and data transformation tools.
  • Hands‑on experience with AWS data services (e.g., S3, Glue, Lambda, RDS).
  • Experience in the insurance technology (insurtech) industry and familiarity with insurance data concepts (e.g., policies, commissions, claims).
  • Demonstrated ability to contribute to the design and implementation of robust data models (e.g., dimensional modeling; see the sketch after this list) for analytics and reporting.
  • A pragmatic problem‑solver who can analyze and refactor complex legacy systems. While you won't be writing new Java code, the ability to read and understand existing Java/Hibernate logic is a strong plus.
  • Excellent communication skills and the ability to collaborate effectively with both technical and non‑technical stakeholders.
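
As a small, purely illustrative sketch of the dimensional modeling mentioned above (assuming pandas and invented column names, not COVU's schema), raw policy rows might be split into a dimension table of descriptive attributes and a fact table of numeric measures:

    import pandas as pd

    # Raw rows as they might arrive from an agency management system (invented fields).
    raw = pd.DataFrame([
        {"policy_id": "P-1001", "agent": "A. Rivera", "carrier": "Acme Insurance",
         "premium": 1200.0, "commission_pct": 0.12},
        {"policy_id": "P-1002", "agent": "B. Chen", "carrier": "Acme Insurance",
         "premium": 830.0, "commission_pct": 0.10},
    ])

    # Dimension table: one row per policy, descriptive attributes only.
    dim_policy = raw[["policy_id", "agent", "carrier"]].drop_duplicates()

    # Fact table: measures keyed back to the dimension by policy_id.
    fact_commission = raw.assign(commission_amount=raw["premium"] * raw["commission_pct"])[
        ["policy_id", "premium", "commission_amount"]
    ]

    print(dim_policy)
    print(fact_commission)
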
Bonus Points For
  • Direct experience working with data from Agency Management Systems like Applied EPIC, Nowcerts, EZlynx, etc.
  • Direct experience working with Carrier data (ACORD XML, IVANS AL3)
  • Experience with business intelligence tools like Tableau, Looker, or Power BI.
  • Prior experience in a startup or fast‑paced agile environment.
Application Process
  • Intro call with People team
  • Technical interviews
  • Final interview with leaders

Location: Los Angeles, CA

Salary: $200,000–$250,000

Seniority level: Mid‑Senior level

Employment type: Full‑time

Job function: Information Technology

Industry: Insurance
