
Senior Manager of Data Engineering

Job in Palo Alto, Santa Clara County, California, 94306, USA
Listing for: GEICO
Full Time position
Listed on 2026-02-21
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Salary/Wage Range: 80,000 – 100,000 USD per year
Job Description & How to Apply Below
Remote type: Hybrid
Location: Palo Alto, CA
Time type: Full time
Posted on: Posted Today
Job requisition: R0062498
At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities.

Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers’ expectations while making a real impact for our company through our shared purpose.

When you join our company, we want you to feel valued, supported and proud to work here. That’s why we offer The GEICO Pledge: Great Company, Great Culture, Great Rewards and Great Careers.

At **GEICO Technology Solutions**, we are on a journey to revolutionize the Insurtech space with our technology offerings in the insurance market and to provide excellent service with better efficiency to our customers and associates. To achieve our vision and mission, we are focused on transforming our existing technology to deliver products and capabilities that are marketplace-ready and agnostic of the line of business or channel utilized.

We’re seeking a Senior Manager of Data Engineering to lead the evolution of our Claims Data Platform. This is a high-impact leadership role where you’ll manage our current database operations while architecting and building the next generation of data infrastructure. You’ll work at the intersection of technology and business, partnering with development teams, claims operations, and analytics to ensure claims data flows seamlessly across the enterprise.

This position sits within the Claims organization, embedded directly in the business unit that owns a significant dataset in the enterprise. You'll operate with the full context of claims operations, regulatory requirements, and business priorities, not as a centralized service, but as an integral part of how Claims technology delivers value.
**Your mission:** Transform how our organization consumes and publishes claims data, moving from legacy systems to a modern, event-driven, scalable data platform with self-service APIs and robust data pipelines.
**What You’ll Do**

**Build Data Interfaces & Pipelines**
* Design and deliver APIs and data services that enable partner teams to
* Build and maintain scalable data pipelines using modern orchestration tools
* Implement data quality frameworks, monitoring, and alerting
* Create self-service data products that reduce time-to-insight for business users
**Drive Strategic Initiatives**
* Collaborate with Architecture, Security, and Compliance teams on data governance
* Champion data engineering best practices: version control, CI/CD, infrastructure-as-code
* Partner with Analytics and Data Science teams to deliver ML-ready datasets
* Report to leadership on platform health, roadmap progress, and strategic opportunities
**What You’ll Bring**

**Required Qualifications**

Experience:
* 10+ years of progressive experience in data engineering, database administration, or data platform development
* 4+ years of people management experience leading technical teams (5+ direct reports)
* Proven track record of building and scaling data pipelines in production environments
* Experience managing enterprise database platforms (Oracle, PostgreSQL, or equivalent)
* Databases: Deep expertise in relational databases (Oracle, PostgreSQL, MSSQL); familiarity with NoSQL (MongoDB, Cassandra)
* Streaming: Working knowledge of Kafka, Kinesis, or Pulsar for real-time data streaming
* Cloud Platforms: Strong experience with Azure or AWS data services
* Data Warehousing: Hands-on experience with Snowflake, BigQuery, Redshift, or Databricks
* Data Pipelines: Proficiency with Apache Airflow, dbt, or similar orchestration tools
* Programming: Solid coding skills in Python and SQL; familiarity with Spark/PySpark
* APIs: Experience designing and building RESTful APIs and data services
**Preferred Qualifications**
* Insurance/Claims Domain: Experience with Guidewire, Duck Creek, or similar claims management systems
* Data…
Position Requirements
10+ years of work experience