
Technical Program Manager, Data Lake

Job in Columbia, Boone County, Missouri, 65201, USA
Listing for: Ansira
Full Time position
Listed on 2026-02-12
Job specializations:
  • IT/Tech
    Data Science Manager, Data Engineer, Data Analyst, Business Systems/Tech Analyst
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 USD yearly
Job Description

At Ansira, we are consolidating and scaling a unified, enterprise Data Lake to integrate product, media, and business data, standardize reporting, and accelerate decision‑making across the organization. We are seeking a Technical Program Manager (TPM) to lead this cross‑functional program end‑to‑end — aligning product and engineering roadmaps, driving ingestion and migration from legacy systems, maturing data governance and quality, and ensuring business adoption of standardized, self‑serve analytics.

This leader will orchestrate work across Ansira’s Product solutions, Data Engineering, Data Science/BI, Media, and Client Partnership teams, with a clear mandate: deliver consistent, governed, and performant data to downstream products and reporting while deprecating redundant systems and minimizing operational cost.

What You’ll Do

Program Leadership & Delivery
  • Own the multi‑quarter program plan for the unified Data Lake: scope, roadmap, milestones, budgets (OPEX/CAPEX), risks, and dependencies.
  • Stand up and run the operating model: weekly workstream standups, cross‑functional syncs, monthly steering committee, and a transparent executive status rhythm.
  • Build and maintain a single‑source‑of‑truth for delivery: program charter, RACI, RAID log, decision log, intake/triage process, and dashboards for progress/risks.
  • Drive the migration plan from legacy pipelines and tools (e.g., Alteryx, Insighter) to the target stack (e.g., Snowflake, Power BI embedded via platform connectors).
  • Coordinate parallel work streams (ingestion, modeling, governance, reporting cutover) to hit time‑bound deliverables with predictable quality.
Product Management & Roadmap
  • Define and maintain the Data Lake program backlog, translating business use cases into technical epics, data contracts, and acceptance criteria.
  • Partner with Product and Data Science teams to standardize media and product reporting packages and ensure they’re backed by governed, contract‑driven data.
  • Prioritize sources and domains for ingestion based on business value, client impact, and technical feasibility; establish clear go/no‑go gates.
  • Align with platform architecture to ensure scalable patterns for batch/stream ELT/CDC, cost control, observability, and reusability across domains.
Data Governance, Quality, and Security
  • Establish practical data contracts with upstream product and business owners; define schema, SLAs, lineage, and DQ checks at ingestion.
  • Stand up governance ceremonies and roles (data owners, stewards) and implement data catalog/lineage practices to improve discoverability and trust.
  • Define and monitor quality KPIs (completeness, timeliness, accuracy) and drive remediation plans with accountable teams.
  • Ensure data privacy, compliance, and security best practices (e.g., PII handling, role‑based access, data masking) across environments.
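As an illustrative sketch only (not part of the role description), quality KPIs like the completeness and timeliness checks above can be expressed as simple SQL over an ingestion table. The table and column names (`media_spend`, `spend_usd`, `loaded_at`) and the 24-hour SLA are hypothetical assumptions, shown here against SQLite for self-containment:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Hypothetical ingestion table; names and the 24h SLA are illustrative,
# not taken from the job posting.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE media_spend (
        campaign_id TEXT,
        spend_usd   REAL,   -- NULL means the source failed to deliver a value
        loaded_at   TEXT    -- ISO-8601 ingestion timestamp
    )
""")
now = datetime.now(timezone.utc)
rows = [
    ("c1", 120.0, now.isoformat()),
    ("c2", None,  now.isoformat()),                          # completeness gap
    ("c3", 75.5, (now - timedelta(hours=30)).isoformat()),   # stale load
]
conn.executemany("INSERT INTO media_spend VALUES (?, ?, ?)", rows)

# Completeness KPI: share of rows with a populated spend value.
completeness = conn.execute(
    "SELECT AVG(spend_usd IS NOT NULL) FROM media_spend"
).fetchone()[0]

# Timeliness KPI: share of rows loaded inside the 24-hour SLA window
# (ISO-8601 strings in the same offset compare correctly as text).
cutoff = (now - timedelta(hours=24)).isoformat()
timeliness = conn.execute(
    "SELECT AVG(loaded_at >= ?) FROM media_spend", (cutoff,)
).fetchone()[0]

print(f"completeness={completeness:.2f}, timeliness={timeliness:.2f}")
```

In practice the same checks would run inside the lake (e.g., as scheduled queries or dbt tests) and feed the KPI dashboards and remediation plans described above.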
Stakeholder Management & Change Adoption
  • Serve as the connective tissue across Product, Engineering, Data Science, Media, Finance, and Client Partnership — communicating decisions, trade‑offs, and timelines.
  • Lead change management for reporting standardization (e.g., Media (AdTech/LBN)-based standard reports), business onboarding to the lake, and client‑facing cutovers.
  • Create enablement assets (runbooks, playbooks, onboarding guides) and training plans to accelerate adoption and reduce support burden.
Technical Fluency
  • Partner effectively with architects and data engineers on Snowflake/BigQuery/Databricks, Azure/AWS/GCP services, orchestration (ADF/Airflow), and transformation (dbt).
  • Understand ELT/CDC patterns, API/file ingestion, schema design for analytics, and BI tooling (Power BI, Looker). Write and review basic SQL for validation.
  • Apply FinOps and performance/cost optimization practices (storage tiers, compute sizing, job scheduling, caching strategies).
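The kind of "basic SQL for validation" mentioned above often means reconciling a legacy table against its migrated counterpart during cutover. A minimal sketch, with hypothetical table names (`legacy_orders`, `lake_orders`) and SQLite standing in for the warehouse:

```python
import sqlite3

# Illustrative cutover validation: compare per-day row counts between a
# legacy table and its target-stack replacement. All names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE legacy_orders (order_date TEXT, order_id TEXT);
    CREATE TABLE lake_orders   (order_date TEXT, order_id TEXT);
    INSERT INTO legacy_orders VALUES
        ('2026-01-01', 'a'), ('2026-01-01', 'b'), ('2026-01-02', 'c');
    INSERT INTO lake_orders VALUES
        ('2026-01-01', 'a'), ('2026-01-02', 'c');  -- 'b' lost in migration
""")

# Left-join daily counts so any divergence surfaces as a result row.
mismatches = conn.execute("""
    SELECT l.order_date, l.n AS legacy_n, COALESCE(t.n, 0) AS lake_n
    FROM (SELECT order_date, COUNT(*) AS n
          FROM legacy_orders GROUP BY order_date) AS l
    LEFT JOIN (SELECT order_date, COUNT(*) AS n
               FROM lake_orders GROUP BY order_date) AS t
        USING (order_date)
    WHERE COALESCE(t.n, 0) <> l.n
""").fetchall()

print(mismatches)  # each row is a date where the two systems disagree
```

Row counts are only a first gate; checksum or sampled row-level comparisons would typically follow before declaring a cutover complete.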
Minimum Qualifications
  • 8+ years in Program/Project/Product Management, with 5+ years leading complex data platform initiatives in a cloud environment.
  • Proven delivery of cross‑functional data programs involving multiple product lines and business stakeholders; strong executive communication.
  • Hands‑on experience with modern data…