
Senior Engineer - Data Analytics

Job in Chicago, Cook County, Illinois, 60290, USA
Listing for: GEICO
Full Time position
Listed on 2026-01-06
Job specializations:
  • IT/Tech
    Data Engineer
Job Description


At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities. Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers’ expectations while making a real impact for our company through our shared purpose.

When you join our company, we want you to feel valued, supported and proud to work here. That’s why we offer The GEICO Pledge:
Great Company, Great Culture, Great Rewards and Great Careers.

Position Summary

GEICO is seeking an experienced Senior Engineer with a passion for building high-performance, low-maintenance, zero-downtime data solutions. You will help drive our insurance business transformation as we transition from a traditional IT model to a tech organization with engineering excellence as its mission. Within the Data Analytics and Vertical Engineering team, you will develop state-of-the-art data pipelines, models, and reports, transforming vast datasets that reach multiple terabytes in size, while championing innovation, best practices, and continuous learning.

Position Description

As a Senior Engineer, you will work to provide an excellent user experience for our internal stakeholders across the organization and maintain the highest standards of data and analytics engineering. Our team thrives and succeeds in delivering high‑quality data solutions in a hyper‑growth environment where priorities shift quickly. The ideal candidate has broad and deep technical knowledge, typically ranging from data processing and pipeline development to dimensional data modeling and reporting.

Position Responsibilities
  • Scope, design, and build scalable, resilient distributed systems
  • Utilize programming languages like Python and SQL, and NoSQL databases, along with Apache Spark for data processing, dbt for data transformation, containerization and orchestration tools such as Docker and Kubernetes, and various Azure tools and services
  • Utilize your passion for data exploration to produce high‑quality reports with tools such as Power BI and Apache Superset to empower outstanding business decisions
  • Use your technical expertise to shape product definitions and drive towards optimal solutions
  • Lead in design sessions and code reviews with peers to elevate the quality of engineering across the organization
  • Engage in cross‑functional collaboration throughout the entire development lifecycle
  • Manage data pipelines, ensuring consistent data availability
  • Mentor other engineers
  • Consistently share best practices and improve processes within and across teams
Qualifications
  • Advanced programming and big data experience with Python, SQL, dbt, Spark, Kafka, Git, and containerization (Docker and Kubernetes)
  • Advanced experience with Data Warehouses (Snowflake preferred), dimensional modeling, and analytics
  • Demonstrable knowledge of business intelligence tools (Power BI and Apache Superset preferred)
  • Experience with Apache Iceberg for managing large‑scale tabular data in data lakes is a plus
  • Experience with orchestration tools such as Apache Airflow or similar technologies to automate and manage complex data pipelines
  • Experience architecting and designing new ETL and BI systems
  • Experience with supporting existing ETL and BI systems
  • Experience with CI/CD to ensure smooth and continuous integration and deployment of data solutions
  • Ability to balance the competing needs of multiple priorities and excel in a dynamic environment
  • Advanced understanding of DevOps concepts, including the Azure DevOps framework and tools
  • Knowledge of developer tooling across the data development life cycle (task management, source code, building, deployment, operations, real‑time communication)
  • Understanding of microservices-oriented architecture, REST APIs, and GraphQL
  • Advanced understanding of data quality monitoring and automated testing
  • Strong problem‑solving ability
  • Experience…
Position Requirements
10+ years of work experience