Technical Program Manager, Data Platform
Job in Plano, Collin County, Texas, 75086, USA
Listed on 2025-12-10
Listing for: Regent Strategic Solutions Inc.
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Data Science Manager, Data Analyst
Job Description
Technical Program Manager (Data Platform)
Location: Mountain View, CA / Plano, TX (Hybrid)
We are seeking an experienced Technical Program/Product Manager (TPM) to lead the end-to-end delivery of our modern data platform. This role involves driving multi-team programs focused on designing, building, and scaling a secure, cost-efficient Lakehouse architecture on AWS.

You'll work cross-functionally with data engineering, platform, security, and product teams to implement capabilities in Databricks (Delta Lake, Unity Catalog, Workflows/Jobs, Delta Live Tables) and to support event-driven data products built on AWS EventBridge, Kafka, and Kinesis, spanning both real-time streaming and batch pipelines.
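For context on the kind of pipeline this platform runs, here is a minimal, illustrative sketch (not the employer's actual code) of a Spark Structured Streaming job that lands Kafka events in a Delta Lake bronze table. The broker address, topic name, and storage paths are hypothetical, and the Kafka and Delta Lake connectors are assumed to be available on the Spark classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# Assumes the spark-sql-kafka and delta-spark packages are on the classpath.
spark = SparkSession.builder.appName("bronze-ingest-example").getOrCreate()

# Read raw events from a (hypothetical) Kafka topic as a streaming DataFrame.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders")                      # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
    .select(
        col("key").cast("string").alias("key"),
        col("value").cast("string").alias("payload"),
        col("timestamp"),
    )
)

# Append events to a Delta Lake "bronze" table; the checkpoint directory
# lets the stream recover its progress after a restart.
query = (
    events.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/lake/checkpoints/orders_bronze")  # hypothetical path
    .start("/lake/bronze/orders")                                     # hypothetical path
)

query.awaitTermination()
```

In a Databricks environment, the same ingestion pattern would more typically be expressed as a Delta Live Tables pipeline or scheduled through Workflows rather than a hand-managed stream.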
Responsibilities:
- Own the program delivery lifecycle for enterprise data platform initiatives.
- Coordinate roadmaps, manage dependencies, and mitigate risks across teams.
- Define and track program metrics, SLAs, and success KPIs.
- Build dashboards and executive reports to communicate program progress and business impact.
- Lead quarterly roadmap planning and prioritization efforts.
- Partner with engineering leads on design decisions, scaling strategy, and cost optimization.
- Manage delivery of new data capabilities in Databricks and event-driven environments.
- Oversee migration efforts from legacy ETL tools (Informatica or equivalent) to modern lakehouse patterns.
- Drive FinOps practices to optimize data compute spend.
- Apply AI-native and GenAI-agent product thinking to improve program execution.
Qualifications:
- 7+ years of experience in technical program or product management within data platform environments.
- Deep understanding of Databricks, AWS, and event-driven architectures.
- Proven experience managing large-scale, cross-functional programs in Agile settings.
- Hands-on experience with data lakehouse architectures, streaming pipelines, and batch ETL processes.
- Strong executive communication and stakeholder management skills.
- Experience in Informatica or similar enterprise ETL tools.
- Familiarity with AI-native tooling (Builder.io, Cursor, etc.) and modern data ecosystems.
- Ability to prototype, experiment, and drive execution independently.
- Prior experience managing migrations from legacy ETL to Databricks or AWS-based lakehouse systems.
- Exposure to enterprise internal data platforms and FinOps cost governance.
- Deep understanding of semantic models, analytical consumption layers, and modern BI tools.
- A combination of technical fluency and delivery excellence.
- The ability to bridge the gap between data engineers, architects, and business stakeholders.
- A forward-thinking mindset — blending data, AI, and product innovation.