
Principal Data Engineer

Job in Draper, Salt Lake County, Utah, 84020, USA
Listing for: Journeyteam
Full Time position
Listed on 2026-02-18
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range: 120,000 - 150,000 USD per year
Job Description

Journey Team’s Data & AI practice is growing, and we’re looking for a hungry, humble, and smart Principal Data Engineer to join our world‑class team. This is a great opportunity for someone with 8+ years of experience who’s ready to take the next step—building modern Microsoft data solutions and working directly with clients. If you’re passionate about solving real business problems with data and want to be part of a collaborative, values‑driven team, we’d love to hear from you.

About Journey Team

At Journey Team, people are at the center of everything we do. Our purpose as a company is to help others effectively use technology to create a positive, lasting impact on the world. With 30 years of technology experience, we are 100% focused on delivering Microsoft business applications and technologies that empower organizations to reach new heights of business success. We deeply understand the transformative value of Microsoft solutions and are dedicated to helping our customers unlock their full potential.

Our experienced team specializes in driving success across Dynamics 365, Microsoft 365, AI and Copilot, Azure, and modern data solutions, all leveraging Microsoft’s comprehensive security platform.

Responsibilities

As a Principal Data Engineer, you will be a thought leader and trusted advisor to our clients, driving innovation, technical excellence, and measurable business outcomes. You will own the design, engineering, and oversight of complex data solutions within the Microsoft ecosystem, while mentoring team members and shaping Journey Team’s Data & AI practice.

  • Architect and engineer data platforms using Microsoft Azure and Fabric, including lakehouse, medallion, dimensional warehouse, and real‑time architectures (an illustrative sketch follows this list).
  • Own end-to-end data platform design across ingestion, transformation, storage, semantic modeling, and governance.
  • Establish and enforce engineering standards for data pipelines, modeling practices, testing frameworks, CI/CD, infrastructure‑as‑code, and deployment processes.
  • Design scalable, secure, high‑performance data systems optimized for reliability, cost efficiency, observability, lineage, and lifecycle management.
  • Lead architectural decision‑making across client engagements, ensuring technical consistency, long‑term maintainability, and alignment with enterprise best practices.
  • Build, review, and optimize complex production‑grade pipelines and transformations using Azure Data Factory, Databricks, Synapse, Fabric, SQL, and Python.
  • Optimize distributed processing workloads (e.g., Spark) and high‑volume SQL environments for performance and scalability.
  • Serve as the senior technical escalation point for complex engineering challenges and distributed system troubleshooting.
  • Translate business requirements into durable, scalable technical architectures grounded in strong engineering principles.
  • Lead technical design sessions, architecture reviews, and solution workshops with both technical and executive stakeholders.
  • Stay current with Microsoft’s evolving Data & AI ecosystem and drive pragmatic adoption of emerging capabilities.
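
To make the platform-architecture bullet above concrete, here is a minimal, illustrative sketch of a bronze-to-silver step in a medallion-style lakehouse. It assumes a Spark runtime with Delta Lake available (for example Databricks or Microsoft Fabric); the lake paths, table locations, and column names are hypothetical placeholders for illustration, not a prescribed Journey Team implementation.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

# Bronze: land raw source files as-is so the original data is preserved (hypothetical path).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")
raw.write.mode("append").format("delta").save("/lakehouse/bronze/orders")

# Silver: deduplicate, enforce types, and drop records that fail basic quality checks.
bronze = spark.read.format("delta").load("/lakehouse/bronze/orders")
silver = (
    bronze.dropDuplicates(["order_id"])
          .withColumn("order_date", F.to_date("order_date"))
          .filter(F.col("order_id").isNotNull())
)
silver.write.mode("overwrite").format("delta").save("/lakehouse/silver/orders")
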
Qualifications
  • 8+ years of progressive experience in data engineering, including at least 5 years designing and implementing enterprise‑scale data platforms on Microsoft Azure.
  • Proven experience architecting and engineering modern data platforms (lakehouse, medallion, dimensional warehouse, batch and real‑time processing).
  • Deep hands‑on expertise with Azure data services, including Data Factory, ADLS, Synapse Analytics, Databricks, Azure SQL Database, Event Hubs, Stream Analytics, Logic Apps, and Microsoft Fabric.
  • Demonstrated ability to design, build, and optimize systems that support high‑volume, high‑velocity, and distributed workloads.
  • Advanced proficiency in SQL and Python, with experience designing, reviewing, and optimizing complex production‑grade codebases.
  • Strong command of dimensional modeling (star/snowflake schemas), semantic modeling, and enterprise data architecture patterns (see the sketch after this list).
  • Experience implementing CI/CD pipelines, DevOps practices, and infrastructure‑as‑code for modern data platforms.
  • Expertise in performance tuning, cost optimization, scalability planning, and…
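
As an illustration of the dimensional-modeling expectation above, the following sketch builds a small star schema with Spark SQL from Python, again assuming a Delta-enabled Spark environment such as Databricks or Fabric. The gold/silver schema names, tables, and columns are assumptions made for the example only.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star_schema_sketch").getOrCreate()

# Dimension: one row per customer, reusable across fact tables (conformed dimension).
spark.sql("""
    CREATE OR REPLACE TABLE gold.dim_customer AS
    SELECT DISTINCT customer_id, customer_name, region
    FROM silver.customers
""")

# Fact: one row per order line, carrying dimension keys plus additive measures.
spark.sql("""
    CREATE OR REPLACE TABLE gold.fact_orders AS
    SELECT o.order_id, o.customer_id, o.order_date, o.quantity, o.line_amount
    FROM silver.orders AS o
""")
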