
Senior Data Engineer

Remote / Online - Candidates ideally in
Tempe, Maricopa County, Arizona, 85285, USA
Listing for: Dutch Bros Coffee
Remote/Work from Home position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description & How to Apply Below

It's fun to work in a company where people truly believe in what they are doing. At Dutch Bros Coffee, we are more than just a coffee company. We are a fun-loving, mind-blowing company that makes a difference one cup at a time.

Position Overview

The Senior Data Engineer in the Dutch Bros IT Data Management & Engineering organization plays a critical role in designing, developing, and optimizing data management pipelines for our enterprise data platforms, ensuring that our data infrastructure is robust, flexible, and aligned with our strategic goals. The Senior Data Engineer contributes to data modeling and upholds data pipeline standards and best practices, shaping the technology backbone of our data ecosystem.

They are responsible for designing, building, and maintaining the data cloud platforms that enable us to collect, store, process, and analyze vast amounts of data efficiently and effectively.

Job Qualifications
  • 10+ years in implementing data-intensive solutions using agile methodologies.
  • Proficiency in programming languages for data engineering, especially Python and SQL, along with frameworks and scripting tools such as Spark and PowerShell.
  • 2+ years of hands-on experience with Snowflake (Snowpipe, SnowSQL, replication, data-sharing) for production-grade data solutions.
  • 3+ years in analytic data modeling, including dimensional modeling and Data Vault 2.0.
  • Experience with data lake implementation, database design, and SQL on platforms like Oracle or SAP HANA.
  • Real-time ETL/ELT and Big Data pipeline experience, especially with Kafka, Workato, ADF, Matillion, and DBT.
  • Hands-on experience with Databricks, Cloud storage, and/or S3.
  • Good understanding of cloud-native development and container orchestration (e.g., serverless, Docker, Kubernetes), cloud platforms (AWS, Azure), and DevOps fundamentals supporting unit testing, CI/CD, and repository management.
  • Familiarity with table and file formats (Iceberg, Hive, Avro, Parquet, JSON).
  • Familiarity with data privacy regulations (GDPR, CCPA) and attribute-based data security.
  • Familiarity with advanced analytics support: building pipelines and frameworks for machine learning.
  • Skilled in handling structured and unstructured data and real-time data processing.
  • Snowflake or Databricks administration knowledge, preferred.
  • Knowledge of NoSQL and Redis-type databases, beneficial.
  • Knowledge of and experience with data governance (data quality, metadata management, data security, data archiving), preferred.
  • Strong problem-solving, analytical, and critical-thinking skills with a focus on business insights.
  • Excellent communication for conveying technical concepts to non-technical stakeholders.
  • Proven ability to work collaboratively in cross-functional teams.
  • Adaptable and proactive in learning new technologies in a dynamic data environment.
  • Strong time management skills to handle multiple projects and ensure high-quality deliverables.
  • Familiarity with both waterfall and agile/iterative delivery methodologies.
  • Food and Beverage experience, preferred.
Location Requirement

This role is located in Tempe, Arizona. This position is required to be in office 4 days per week (Mon‑Thurs);
Fridays are optional remote work days.

Key Result Areas (KRAs)

Assess, design, build, and maintain the data pipelines that enable Dutch Bros teams to collect, store, process, and analyze vast amounts of data efficiently and effectively:

  • Seamless Snowflake Implementation:
    Successfully develop and deploy production‑grade data solutions on Snowflake, Databricks, or similar tools.
  • Optimized Data Pipelines:
    Design and enhance ETL/ELT pipelines and data transformations for efficiency, supporting advanced analytics and real‑time data processing with technologies such as Kafka, Fivetran, Databricks, and Matillion, or leveraging PySpark.
  • High‑Quality Data Models:
    Implement dimensional models and data products to improve data accuracy and accessibility, enabling comprehensive ad‑hoc analysis, reporting, and visualization.
  • Advanced Analytics Support:
    Develop and maintain data engineering frameworks that support machine learning workflows and complex data analyses, ensuring that data is readily available for actionable…
Position Requirements
10+ Years work experience