Data Engineer

Job in Phoenix, Maricopa County, Arizona, 85003, USA
Listing for: PrePass
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech: Data Engineer
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 USD per year
Job Description

About PrePass

PrePass® is North America's most trusted weigh station bypass and toll management platform. We're transforming how the transportation industry operates—creating solutions that keep trucks moving safely, efficiently, and compliantly. This means making bold decisions and building systems that support not only fleets but the broader economy. It all starts with enabling commercial vehicles to keep rolling with seamless toll management, weigh station bypass, and safety solutions.

It's what we do best, and we do it to meet the demands of the road every day.

That's why people join us: our solutions run in real time, on highways and interstates across the nation, helping fleets go farther, faster. The work is both challenging and rewarding, presenting complex problems that demand ambitious answers. We hire bold thinkers with a heart for impact, a passion for progress, and the optimism to shape the future of transportation.

About The Role

We're looking for a skilled Data Engineer to join our team in the transportation sector. In this role, you'll work with modern cloud technologies to design, build, and maintain data pipelines that support analytics, reporting, and operational insights. You'll be part of a highly collaborative product engineering team focused on delivering reliable, scalable data solutions that enable smarter decision‑making across the organization.

This is a hybrid position based in our downtown Phoenix office.

This role operates within an Agile environment using Scrum, with strong XP engineering practices. The team emphasizes small, frequent deliveries, continuous improvement, Test Driven Development, and shared ownership. You will collaborate daily with product managers, QA, and fellow engineers through standups, backlog refinement, sprint planning, reviews, and retrospectives. Strong verbal and written communication skills are essential, as active participation in design discussions, pairing, and problem‑solving is a core part of how the team operates.

This is a great opportunity for someone with solid experience in backend data systems who enjoys solving real‑world problems, working in fast feedback loops, and contributing to continuously evolving data platforms.

Key Responsibilities

Data Platform & Pipeline Development
  • Design, develop, and maintain cloud‑native data pipelines using Databricks, Microsoft Azure Data Factory, and Microsoft Fabric to support data integration and analytics solutions.
  • Implement incremental and real-time data ingestion strategies using a medallion architecture for data lake storage (a simplified sketch of this pattern follows this list).
  • Write, optimize, and maintain complex SQL queries to transform, integrate, and analyze data across enterprise systems.
  • Develop solutions with a focus on scalability, maintainability, testability, and long‑term operability within a continuous delivery mindset.
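The ingestion bullet above refers to the medallion (bronze/silver/gold) layering pattern. As a rough, hypothetical sketch only, and not a description of PrePass's actual pipelines, a minimal batch version in PySpark might look like this; the paths, column names (event_id, event_time, facility_id, toll_amount), and the hard-coded watermark are all assumptions made for illustration.

```python
# Hypothetical sketch of a batch medallion (bronze/silver/gold) pipeline in PySpark.
# All paths, column names, and the watermark value are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw event files as-is, tagging each row with ingestion time.
bronze = (spark.read.json("/data/raw/toll_events/")            # hypothetical source path
               .withColumn("ingested_at", F.current_timestamp()))
bronze.write.mode("append").parquet("/data/bronze/toll_events/")

# Silver: incremental load - keep only records newer than the last processed
# timestamp (a stand-in for a real watermark/CDC mechanism), then clean them.
last_watermark = "2026-02-15T00:00:00"                         # normally read from pipeline state
silver = (spark.read.parquet("/data/bronze/toll_events/")
               .filter(F.col("event_time") > F.lit(last_watermark))   # assumes ISO-8601 strings
               .dropDuplicates(["event_id"])
               .withColumn("event_date", F.to_date("event_time")))
silver.write.mode("append").partitionBy("event_date").parquet("/data/silver/toll_events/")

# Gold: business-level aggregate used for analytics and reporting.
gold = (spark.read.parquet("/data/silver/toll_events/")
             .groupBy("event_date", "facility_id")
             .agg(F.count("*").alias("event_count"),
                  F.sum("toll_amount").alias("toll_revenue")))
gold.write.mode("overwrite").parquet("/data/gold/daily_toll_summary/")
```

In a production Databricks or Azure Data Factory setup, the watermark would be tracked in pipeline state and the tables would typically use a managed format rather than raw parquet paths; the sketch only shows the layering idea.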
Data Operations & Reliability
  • Support and troubleshoot legacy data platforms built on SSIS and SQL Server, ensuring high availability and performance of critical data processes.
  • Identify, troubleshoot, and resolve data integration and data quality issues to ensure reliable production data delivery.
  • Contribute to observability, automated validation, and CI/CD pipelines to support fast feedback and safe releases (see the validation sketch after this list).
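The automated-validation bullet above describes checks that typically run in CI/CD before data is promoted. Purely as an illustrative sketch under assumed table and column names, and not a description of PrePass's tooling, a basic pre-release data-quality gate could look like this:

```python
# Hypothetical sketch of an automated data-quality gate that could run in CI/CD.
# The table path, column names, and rules are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

def validate_toll_events(df):
    """Return a list of human-readable rule violations (empty list = all checks pass)."""
    failures = []
    if df.filter(F.col("event_id").isNull()).count() > 0:
        failures.append("null event_id values found")
    if df.count() != df.dropDuplicates(["event_id"]).count():
        failures.append("duplicate event_id values found")
    if df.filter(F.col("toll_amount") < 0).count() > 0:
        failures.append("negative toll_amount values found")
    return failures

if __name__ == "__main__":
    spark = SparkSession.builder.appName("dq-gate-sketch").getOrCreate()
    events = spark.read.parquet("/data/silver/toll_events/")   # hypothetical table
    problems = validate_toll_events(events)
    if problems:
        raise SystemExit("Data quality gate failed: " + "; ".join(problems))
    print("All data quality checks passed.")
```

A CI job would typically run a check like this against a staging copy of the data and block the release on a non-zero exit code.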
Agile Collaboration & Engineering Practices
  • Collaborate daily with product owners, QA, and engineers through Scrum ceremonies including standups, backlog refinement, sprint planning, sprint reviews, and retrospectives.
  • Participate in proof‑of‑concept efforts, technical spikes, and design discussions, providing thoughtful technical analysis and pragmatic recommendations.
  • Apply XP engineering practices such as pairing, incremental delivery, continuous refactoring, and shared code ownership to maintain high‑quality, evolvable systems.
  • Clearly communicate technical concepts, risks, tradeoffs, and progress to both technical and non‑technical stakeholders.
Requirements

Required Qualifications
  • 5+ years of experience designing and building data solutions.
  • Strong proficiency in SQL and Python for data analytics and transformation.
  • Hands‑on experience with ETL pipeline development and automation.
  • Solid understanding of data…