
Senior Product Manager, Data Platform - Lakehouse

Job in Palo Alto, Santa Clara County, California, 94306, USA
Listing for: GEICO
Full Time position
Listed on 2025-12-30
Job specializations:
  • IT/Tech
    Data Analyst, Data Science Manager, Cloud Computing, Data Engineer
Job Description


At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities. Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers’ expectations while making a real impact for our company through our shared purpose. When you join our company, we want you to feel valued, supported and proud to work here.

That’s why we offer The GEICO Pledge:
Great Company, Great Culture, Great Rewards and Great Careers.

GEICO is looking for an accomplished, customer-obsessed, and results-oriented Senior Product Manager to lead our Data Lakehouse platform. This role will drive strategic direction for our Lakehouse infrastructure, focusing on both front-end analytics and compute client capabilities. The ideal candidate will have a strong technical background in data platforms and a proven track record of delivering complex, scalable solutions.

Description

As a Senior Product Manager for the Data Lakehouse platform, you will be responsible for defining and executing the product vision for GEICO's data lakehouse products. You will work across two key areas: the front-end analytics platform, which supports services such as Jupyter Notebooks and Apache Superset, and the compute client and integration layer. This role requires a deep technical understanding of data platforms and lakehouse architectures, strong stakeholder management skills, and the ability to bridge technical solutions with business value.

Key Responsibilities
  • Develop and execute a comprehensive platform vision aligned with business goals and customer needs.
  • Create and maintain a clear, prioritized roadmap that balances short-term delivery with long-term strategic objectives.
  • Evangelize the Lakehouse platform across the organization and drive stakeholder alignment.
  • Stay abreast of industry trends and competitive landscape (Databricks, Snowflake, etc.) to inform platform strategy.
  • Lead requirements gathering and product strategy for front-end data tools such as Jupyter Notebook and Apache Superset integrations.
  • Understand end-to-end AI/ML operations workflows and how notebooks fit into the broader data ecosystem.
  • Drive data governance initiatives and cross-team collaboration with other data teams.
  • Ensure the platform adheres to regulatory, compliance, and data quality standards.
  • Own the development of client layers, in partnership with engineering, to increase adoption of the Lakehouse compute platform.
  • Define product capabilities for client libraries, templates, and integration frameworks to improve platform accessibility.
  • Conduct customer roadshows and training on compute platform capabilities.
  • Build instrumentation and observability into the client layer to enable data-driven product decisions.
  • Work closely with engineering, design, and data teams to ensure seamless product delivery.
  • Partner with customer success, support, and engineering teams to create clear feedback loops.
  • Translate technical capabilities into business value and user benefits.
  • Drive alignment across multiple stakeholders and teams in complex, ambiguous environments.
Qualifications

Required

  • Strong understanding of data lakehouse architectures, query engines, and compute frameworks (Spark, Trino, Databricks, Snowflake).
  • Experience building APIs, SDKs, or client integration layers for large-scale platforms.
  • Familiarity with instrumentation, telemetry, and observability practices.
  • Experience in cloud data ecosystems (Snowflake, AWS, GCP, Azure).
  • Proven analytical and problem-solving abilities with a data-driven approach to decision-making.
  • Experience working with Agile methodologies and tools (JIRA, Azure DevOps).
  • Excellent communication, stakeholder management, and cross-functional leadership skills.
  • Exceptional organizational skills with proven ability to manage complex backlogs.

Preferred

  • Previous experience as a software or data engineer.
  • Strong business acumen to prioritize features based on customer value and business impact.
  • Experience with Jupyter Notebooks and Apache Superset.
  • Knowledge…
Position Requirements
10+ years of work experience