Engineer II - Data Lakehouse
Job in Chevy Chase, Montgomery County, Maryland, 20815, USA
Listed on 2025-11-27
Listing for: GEICO
Full Time position
Job specializations:
* IT/Tech: Data Engineer, Cloud Computing, Systems Engineer, AI Engineer
Job Description & How to Apply Below
At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities.

Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers’ expectations while making a real impact for our company through our shared purpose.

When you join our company, we want you to feel valued, supported, and proud to work here. That’s why we offer The GEICO Pledge: Great Company, Great Culture, Great Rewards and Great Careers.
Position Summary

GEICO is seeking an Engineer with a passion for building high-performance, low-maintenance, zero-downtime platforms and core data infrastructure. You will help drive our insurance business transformation as we transition from a traditional IT model to a tech organization with engineering excellence as its mission, while co-creating a culture of psychological safety and continuous improvement.
Position Description

Our Engineer II is a key member of the engineering staff, working across the organization to innovate and bring the best open-source data infrastructure and practices into GEICO as we embark on a greenfield project to implement a core Data Lakehouse for all of GEICO’s core data use cases across each of the company’s business verticals.
Position Responsibilities

As an Engineer II, you will:
* Scope, design, and build scalable, resilient Data Lakehouse components
* Lead architecture sessions and reviews with peers and leadership
* Own the quality, usability, and performance of the solutions you deliver
* Spearhead new software evaluations and innovate with new tooling
* Determine and support resource requirements, evaluate operational processes, measure outcomes to ensure desired results, and demonstrate adaptability while sponsoring continuous learning
* Collaborate with customers, team members, and other engineering teams to solve our toughest problems
* Consistently share best practices and improve processes within and across teams
* Share your passion for staying on top of the latest open-source projects, experimenting with, and learning recent technologies, participating in internal and external technology communities, and mentoring other members of the engineering community
Qualifications

* Exemplary ability to design, develop, and perform experiments
* Experience developing new and enhancing existing open-source based Data Lakehouse platform components
* Experience cultivating relationships with and contributing to open-source software projects
* Experience with:
  + Apache Superset for data visualization and business intelligence
  + Jupyter Notebook for data science and machine learning development
* Experience with cloud computing (AWS, Microsoft Azure, Google Cloud, Hybrid Cloud, or equivalent)
* Expertise in developing large-scale distributed systems that are scalable, resilient, and highly available, with a focus on:
  + Designing and implementing systems that can handle high traffic and large data volumes
  + Ensuring system reliability, uptime, and performance in complex environments
* Expertise in container technologies such as Docker, and in Kubernetes platform development
* Experience with continuous delivery and infrastructure as code
* In-depth knowledge of DevOps concepts and cloud architecture
* Experience with Azure networking (subscriptions, security zoning, etc.) or equivalent
Desirable:

* Experience with MLOps pipeline development and management, including:
  + Designing and implementing data pipelines for machine learning workflows
  + Ensuring data quality, integrity, and security in ML pipelines
  + Monitoring and optimizing ML pipeline performance and efficiency
* Experience working with Large Language Models (LLMs) to create agentic systems, including:
  + Integrating LLMs with data lakehouse platforms and other systems
  + Developing and deploying agentic models and workflows
  + Ensuring model performance, reliability, and security in production environments
* Ability to excel in a fast-paced, startup-like environment
Experience

Preferred qualifications…