Lead Data Engineer
Listing for: MathCo
Full Time position
San Jose, Santa Clara County, California, 95199, USA
Listed on 2026-02-17
Job specializations:
- IT/Tech: Data Engineer, Data Science Manager
Job Description
We are seeking a Lead Data Engineer with expertise in Ecommerce to lead client-facing data engineering initiatives. In this role, you’ll oversee a team delivering advanced data solutions that enable sustained analytics transformation for global enterprises. You will partner with clients to design scalable data strategies, implement modern pipelines, and translate business goals into actionable technology roadmaps.
This is a hands‑on leadership role where you’ll balance technical delivery (SQL, Databricks, GCP) with program management across cross‑functional teams. Candidates with strong domain knowledge and the ability to quickly upskill on cloud technologies will thrive.
Key Responsibilities
- Lead and mentor data engineering teams, ensuring delivery excellence across projects.
- Partner with retail and supply chain stakeholders to translate business needs into scalable data solutions.
- Manage programs and client engagements, aligning data initiatives with broader enterprise strategies.
- Design and oversee data pipelines, ETL/ELT processes, and data warehouse implementations.
- Apply SQL and Databricks expertise to build, optimize, and maintain high-performance data solutions.
- Collaborate with solution architects and data scientists to ensure integrated, end-to-end outcomes.
- Drive adoption of best practices in engineering, agile delivery, and program governance.
- Communicate complex concepts simply, providing actionable recommendations to executive stakeholders.
Qualifications
- 6+ years of experience in data engineering and program management.
- Strong domain expertise in Ecommerce.
- Proven track record managing client programs, stakeholders, and cross‑functional teams.
- Hands‑on experience with SQL and data modeling.
- Working knowledge of Databricks and GCP (with ability to deepen expertise quickly).
- Background in designing and managing end‑to‑end data pipelines, ETL/ELT workflows, and data warehouses.
- Familiarity with agile delivery practices and engineering best practices (e.g., Git).
- Bachelor’s degree in Computer Science, Engineering, or related field.
Preferred Qualifications
- Consulting background with experience delivering solutions to Fortune 500 clients.
- Experience with PySpark or additional cloud platforms (AWS, Azure).