
Senior Data Engineer, Personal Insurance

Job in Hartford, Hartford County, Connecticut, 06112, USA
Listing for: Travelers
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description

Who Are We?

Taking care of our customers, our communities and each other. That’s the Travelers Promise. By honoring this commitment, we have maintained our reputation as one of the best property casualty insurers in the industry for over 170 years. Join us to discover a culture that is rooted in innovation and thrives on collaboration. Imagine loving what you do and where you do it.

Job Category

Technology

Compensation Overview

The annual base salary range provided for this position is a nationwide market range and represents a broad range of salaries for this role across the country. The actual salary for this position will be determined by a number of factors, including the scope, complexity and location of the role; the skills, education, training, credentials and experience of the candidate; and other conditions of employment.

As part of our comprehensive compensation and benefits program, employees are also eligible for performance-based cash incentive awards.

Salary Range

$ - $

Target Openings

1

What Is the Opportunity?

Travelers’ Data Engineering team builds pipelines that contextualize data and make it easily accessible across the entire enterprise. As a Senior Data Engineer, you will accelerate the growth and transformation of our analytics landscape. You will bring a strong desire to guide team members’ growth and to develop data solutions that translate complex data into user-friendly terms. You will leverage your ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data to support Artificial Intelligence, Machine Learning, and business intelligence/insights.

What Will You Do?
  • Build and operationalize complex data solutions, correct problems, apply transformations, and recommend data cleansing/quality solutions.
  • Design complex data solutions, including incorporating new data sources and ensuring designs are consistent across projects and aligned to data strategies.
  • Analyze complex data sources to determine their value and use, and recommend data to include in analytical processes.
  • Incorporate core data management competencies including data governance, data security and data quality.
  • Act as a data and technology subject matter expert within lines of business to support delivery and educate end users on data products and the analytics environment.
  • Perform data and system analysis, assessment, and resolution for highly complex defects and incidents, and correct them as appropriate.
  • Collaborate across teams to support delivery and educate end users on complex data products and the analytics environment.
  • Perform other duties as assigned.
What Will Our Ideal Candidate Have?
  • Bachelor’s degree in a STEM-related field or equivalent
  • Ten years of related experience
  • Primary Job Requirements:
    • Architect and design scalable, secure data solutions using AWS, Databricks, and Ab Initio.
    • Lead technical direction for data engineering initiatives across cloud and on‑premises infrastructure.
    • Hands‑on development: build ETL pipelines, optimize Spark jobs, and create Ab Initio graphs.
    • Troubleshoot production issues and provide technical guidance to junior engineers.
    • Conduct mentoring sessions and offer technical guidance to the 20‑person admin team.
    • Collaborate with DBA teams, business analysts, and QA teams to ensure data governance and quality.
    • Manage infrastructure deployment and optimize cloud resources.
    • Lead technical design reviews and architecture discussions.
    • Implement data integration solutions and ensure compliance with data protection regulations.
    • Establish and enforce coding standards, best practices, and data governance policies.
  • Technical Skills (see the illustrative sketch after this list):
    • Ab Initio: Expert proficiency with GDE, Co>Operating System, EME, BRE, Express>It, metaprogramming (PDL)
    • Programming: Python, PySpark, SQL
    • Cloud: AWS architecture and services
    • Databricks: Workspace management, cluster configuration, Delta Lake, Unity Catalog
    • Data Warehousing: Strong understanding of data modeling, dimensional modeling (star/snowflake schemas)
    • ETL/ELT: End‑to‑end ETL development lifecycle
    • Version Control: Git, CI/CD pipelines
  • Advanced knowledge of data tools, techniques, and manipulation, including cloud platforms, programming languages, and modern software…
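
For context only: the posting names PySpark, SQL, Delta Lake, and end-to-end ETL development among the required skills. Below is a minimal, hypothetical sketch of that kind of pipeline step, assuming a Delta-enabled Spark environment such as Databricks. Every path, table name, and column in it is invented for illustration and is not part of the job description.

    # Hypothetical sketch of a PySpark ETL step writing to Delta Lake.
    # All paths, table names, and columns are invented for illustration.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("policy-etl-sketch").getOrCreate()

    # Extract: read raw records from a (hypothetical) S3 landing zone.
    raw = spark.read.json("s3://example-bucket/raw/policies/")

    # Transform: simple cleansing/quality rules of the kind the
    # "data cleansing/quality" bullet above refers to.
    clean = (
        raw.dropDuplicates(["policy_id"])
           .filter(F.col("premium") > 0)
           .withColumn("effective_date", F.to_date("effective_date"))
    )

    # Load: write to a Delta table (assumes delta-spark or Databricks).
    clean.write.format("delta").mode("overwrite") \
        .saveAsTable("example_catalog.insurance.policies_clean")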
Position Requirements
10+ years of work experience