Sr Data Analytics and Vertical Engineering - Claims Domain Engineer
Listed on 2026-01-29
IT/Tech
Data Engineer, Cloud Computing
Overview
At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities. Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers’ expectations while making a real impact for our company through our shared purpose. When you join our company, we want you to feel valued, supported and proud to work here.
That’s why we offer The GEICO Pledge:
Great Company, Great Culture, Great Rewards and Great Careers.
GEICO is seeking an experienced Senior Data Engineer with a passion for building high-performance, low maintenance, zero-downtime data solutions. You will help drive our insurance business transformation as we transition from a traditional IT model to a tech organization with engineering excellence as its mission. Within the Data Analytics and Vertical Engineering team, you will play a key role in leveraging modern technologies to enhance our data capabilities, while championing innovation, best practices, and continuous learning.
Position Description
As a Senior Data Engineer, you will build and maintain robust data systems that power a state-of-the-art analytics platform. Our team thrives on delivering high-quality data solutions in a hyper-growth environment where priorities shift quickly. The ideal candidate has broad and deep technical knowledge, spanning data pipeline development, data transformation, data storage, and processing optimization.
Position Responsibilities
- Scope, design, and build scalable, resilient distributed systems
- Utilize programming languages like Python and SQL, NoSQL databases, Apache Spark for data processing, dbt for data transformation, container orchestration services such as Docker and Kubernetes, and various Azure tools and services
- Use your technical expertise to shape product definitions and drive towards optimal solutions
- Engage in cross-functional collaboration throughout the entire development lifecycle
- Lead in design sessions and code reviews with peers to elevate the quality of engineering across the organization
- Define, create, and support reusable data components and patterns that align with both business and technology requirements
- Build a world-class analytics platform to satisfy reporting needs
- Mentor other engineers
- Consistently share best practices and improve processes within and across teams
Qualifications
- Advanced programming and big data experience with Python, SQL, dbt, Spark, Kafka, Git, and containerization (Docker and Kubernetes)
- Experience with Apache Iceberg for managing large-scale tabular data in data lakes is a plus
- Experience with orchestration tools such as Apache Airflow or similar technologies to automate and manage complex data pipelines
- Experience with business intelligence tools (Power BI or Superset preferred)
- Proven understanding of microservices-oriented architecture, REST APIs, and GraphQL
- Experience architecting and designing both new and existing systems
- Advanced understanding of DevOps concepts, including the Azure DevOps framework and tools
- Experience with CI/CD to ensure smooth and continuous integration and deployment of data solutions
- Advanced PowerShell scripting skills
- Advanced understanding of monitoring concepts and tooling
- Advanced understanding of security protocols and products
- In-depth knowledge of computer science data structures and algorithms
- Knowledge of developer tooling across the data development life cycle (task management, source code, building, deployment, operations, real-time communication)
- Strong problem-solving ability
- Ability to excel in a fast-paced environment
- 4+ years of professional experience in data engineering, programming, and development with big data technologies
- 3+ years of experience with architecture and design
- 3+ years of experience with AWS, GCP, Azure, or another cloud service
- 2+ years of experience with big data tools like Spark and Databricks
- Bachelor’s degree in Computer Science, Information Systems, or equivalent education or work experience
$75,000.00 - $
Th…