
Senior Data Engineer

Job in Frisco, Collin County, Texas, 75034, USA
Listing for: Keurig Dr Pepper Inc.
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Job Description

Job Overview

Are you a hands‑on data engineer passionate about building scalable data pipelines and enabling analytics that drive business decisions? At Keurig Dr Pepper, we’re looking for a Senior Data Engineer to develop and optimize data workflows as part of our modern data ecosystem. You will work closely with product managers, data scientists, and analysts to build reliable, efficient data solutions that support enterprise reporting, self‑service analytics, and machine learning initiatives.

This is a high‑impact role within a fast‑moving, data‑driven team.

Responsibilities

Data Engineering & Architecture
  • Design and implement scalable, well‑documented data pipelines using dbt, SnowSQL, and cloud‑native tools.
  • Develop transformation layers and dimensional models to support reporting, dashboards, and advanced analytics.
  • Build integrations with structured and semi‑structured data sources from internal and third‑party systems.
  • Apply data quality checks and validation frameworks to ensure accuracy, completeness, and reliability.
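To illustrate the kind of data-quality validation this role calls for, here is a minimal sketch in plain Python; the column names, thresholds, and sample rows are hypothetical, not taken from any Keurig Dr Pepper system.

```python
# Minimal data-quality check: completeness (no nulls in key columns)
# and validity (values within an expected range). Column names and
# thresholds are illustrative only.

def check_quality(rows, required=("order_id", "amount"), max_amount=1_000_000):
    """Return counts of failed checks for a list of row dicts."""
    failures = {"missing": 0, "out_of_range": 0}
    for row in rows:
        if any(row.get(col) is None for col in required):
            failures["missing"] += 1
        elif not (0 <= row["amount"] <= max_amount):
            failures["out_of_range"] += 1
    return failures

sample = [
    {"order_id": 1, "amount": 250.0},
    {"order_id": 2, "amount": None},   # fails completeness
    {"order_id": 3, "amount": -10.0},  # fails validity
]
print(check_quality(sample))  # → {'missing': 1, 'out_of_range': 1}
```

In production these checks would typically live in a framework such as dbt tests rather than ad hoc scripts, but the principle is the same: validate accuracy and completeness before data reaches downstream consumers.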
Optimization & Automation
  • Continuously monitor and optimize data pipelines for performance, scalability, and cost‑efficiency.
  • Contribute to code versioning, testing, and deployment automation using Git and CI/CD pipelines.
  • Leverage tools such as UC4, Airflow, Fivetran, or Informatica Cloud to orchestrate ELT workflows.
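The core idea behind the orchestration tools named above is running tasks in dependency order. The toy sketch below shows that idea in standard-library Python (the task names are hypothetical); Airflow, UC4, and similar tools add scheduling, retries, and monitoring on top of it.

```python
# Toy ELT orchestration: run tasks in dependency order, the core
# concept that tools like Airflow or UC4 formalize. Task names and
# bodies are illustrative placeholders.
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """tasks: name -> callable; deps: name -> set of upstream names."""
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        tasks[name]()  # each task runs only after its upstreams
    return order

tasks = {
    "extract": lambda: "raw rows",
    "load": lambda: "staged",
    "transform": lambda: "modeled",
}
deps = {"extract": set(), "load": {"extract"}, "transform": {"load"}}
print(run_pipeline(tasks, deps))  # → ['extract', 'load', 'transform']
```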
AI‑Ready Data Ecosystems
  • Collaborate with Data Scientists to support AI/ML pipelines—enabling efficient feature engineering, model training, and real‑time inferencing.
  • Integrate AI‑driven capabilities such as anomaly detection, intelligent alerting, and natural language enrichment into data workflows.
  • Prepare structured datasets to support experimentation, forecasting, and anomaly detection use cases.
Collaboration & Delivery
  • Work closely with product owners, data analysts, and other engineers to understand business requirements and translate them into data models and pipelines.
  • Participate in agile development processes—standups, backlog grooming, sprint planning, and retrospectives.
  • Contribute to data design discussions and ensure adherence to architectural and governance standards.
Leadership & Collaboration
  • Collaborate with cross‑functional teams—including Solution Architects, Product Managers, Data Scientists, and Analysts—to align on data strategy and business outcomes.
  • Provide technical leadership for product teams, including external partners, ensuring timely delivery of high‑quality data solutions.
  • Provide technical thought leadership, guiding product teams on design decisions and best practices.
Governance & Optimization
  • Define and enforce engineering standards and best practices across the analytics ecosystem.
  • Lead architectural design reviews, ensuring technical rigor and adherence to change control processes.
  • Continuously assess and optimize the performance, reliability, and scalability of the data platform.
Who you are

You’re a strategic, hands‑on engineering leader who combines deep technical expertise with strong business acumen. You’re passionate about solving complex data challenges, enabling AI, and mentoring teams to deliver enterprise‑grade data solutions.

Key Skills & Expertise
  • Technical mastery – deep, hands‑on expertise is expected in the areas below.
  • Expert‑level knowledge of Snowflake architecture, SnowSQL, and data transformation workflows.
  • Advanced proficiency with dbt for modeling, testing, versioning, and orchestrating ELT pipelines.
  • Strong command of SQL, Python, and scalable data pipeline development.
  • Proven experience designing and managing enterprise data warehouses and cloud‑native data platforms, including Databricks.
  • Deep understanding of modern data modeling techniques (e.g., dimensional, data vault, star/snowflake schemas).
  • Experience developing and managing ELT/ETL pipelines using tools like Fivetran, Informatica, or Azure Data Factory.
  • Familiarity with CI/CD practices, Git‑based workflows, and agile development methodologies.
  • Knowledge of AI/ML platforms such as Databricks, SageMaker, AutoML, or TensorFlow (preferred).
  • Experience with BI tools such as Power BI, Tableau, or MicroStrategy.
  • Working knowledge of data governance practices (e.g., metadata management, …)
Position Requirements
10+ years of work experience