
Data Engineer

Job in Phoenix, Maricopa County, Arizona, 85003, USA
Listing for: Adelante Healthcare
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer
Job Description

POSITION SUMMARY

The Data Engineer will build scalable, trusted, and secure data models and pipelines that support analytics, reporting, and operations across our healthcare organization. The Data Engineer plays a critical part in delivering validated, governed, and actionable data products that drive decision-making throughout the business.

This position involves end-to-end ownership of data workflows, including ingestion, transformation, modeling, and delivery. It requires strong proficiency in SQL and Python, as well as deep experience with Azure data services such as Synapse, Data Factory, Logic Apps, and Microsoft Fabric. The Data Engineer is responsible for applying continuous validation at every stage of the process and maintaining clear, collaborative communication with both stakeholders and technical team members to ensure data accuracy, alignment, and reliability.

EXPECTATIONS

Every Adelante Healthcare employee will strive every day to maximize their performance and contribution to Adelante Healthcare and the community we serve. Employees are expected to work in a manner that demonstrates a commitment to quality, patient safety, employee engagement, innovation, and the highest standards of personal integrity, professionalism, and competence.

OUR CORE VALUES
  • Inclusion
  • Nurture
  • Service
  • Purposeful
  • Integrity
  • Resilient
  • Engaged
QUALIFICATIONS

ESSENTIAL SKILLS AND EXPERIENCE
  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related technical field, preferred.
  • Three (3) to five (5) years of relevant data engineering experience.
  • Strong proficiency in SQL for large-scale data transformation, query optimization, and schema development.
  • Advanced Python skills for scripting ETL/ELT logic, validation routines, automation, and integration.
  • Demonstrated ability to design and implement scalable, maintainable data pipelines using Azure Data Factory, Synapse Pipelines, and Logic Apps.
  • Expertise in data modeling, including normalized and dimensional models (star/snowflake), surrogate keys, SCD handling, and business-friendly schema design.
  • Experience implementing robust data validation and quality frameworks, including completeness, accuracy, integrity, and anomaly detection.
  • Proficient with Azure Synapse Analytics (dedicated and serverless pools), Azure Data Lake Storage, and Logic Apps for managing ingestion, transformation, and orchestration.
  • Working knowledge of data security practices and compliance (HIPAA, PHI), including RBAC, RLS, and secure data delivery.
  • Familiarity with CI/CD pipelines and infrastructure-as-code using Azure DevOps or GitHub Actions.
  • Ability to deliver trusted, analytics-ready datasets to business teams via Microsoft Fabric lakehouses and Power BI.
  • Experience supporting Power BI development by preparing validated semantic models, implementing row-level security, and enabling performance-optimized report layers, always backed by solid pipeline foundations.
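As a rough illustration of the validation and quality checks named above (completeness, referential integrity, anomaly detection), a minimal Python sketch might look like the following. All function names, field names, and thresholds are hypothetical, not part of Adelante Healthcare's actual stack:

```python
# Minimal sketch of completeness, integrity, and anomaly checks.
# Field names (patient_id, visit_date, charge_amount) are hypothetical.
from statistics import mean, stdev

REQUIRED_FIELDS = {"patient_id", "visit_date", "charge_amount"}

def check_completeness(rows):
    """Flag rows missing a required field or carrying a null value."""
    return [r for r in rows
            if not REQUIRED_FIELDS <= r.keys()
            or any(r.get(f) is None for f in REQUIRED_FIELDS)]

def check_integrity(rows, valid_patient_ids):
    """Referential integrity: every row must reference a known patient."""
    return [r for r in rows if r.get("patient_id") not in valid_patient_ids]

def check_anomalies(values, z_threshold=3.0):
    """Simple z-score anomaly detection over a numeric column."""
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [v for v in values if abs(v - mu) / sigma > z_threshold]
```

In practice, checks like these would run as a stage inside each pipeline (e.g. a Synapse or Data Factory activity) and route failing rows to a quarantine table rather than silently dropping them.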
OTHER ESSENTIAL SKILLS AND EXPERIENCE
  • Certification in Cardiopulmonary Resuscitation (CPR) for the Health Care Professional and AED through courses that follow the guidelines of the American Heart Association and the Red Cross (cognitive and skills evaluations)
  • Valid Level One Fingerprint Clearance Card issued by the Arizona Department of Public Safety for all specialty behavioral health locations or ability to obtain within 30 days of employment
  • Prioritization and multi-task skills are required
  • Competency in working with people of various cultures
  • Ability to perform a variety of assignments requiring considerable exercise of independent judgment
POSITION RESPONSIBILITIES
  • Build and maintain robust data pipelines that ingest, transform, validate, and deliver data from multiple internal and external sources.
  • Design data models that align with business use cases and support high-performance reporting and analytics.
  • Implement data validation logic to ensure quality, trust, and accuracy in all downstream outputs.
  • Automate workflows using Logic Apps and Data Factory to support real-time and batch use cases.
  • Ensure pipeline observability, logging, and monitoring for proactive issue detection and resolution.
  • Collaborate with…
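The observability, logging, and monitoring responsibility above could, for example, be sketched as a small Python wrapper that records each stage's duration and record count; the stage and function names here are hypothetical, not taken from the actual pipelines:

```python
# Minimal sketch of pipeline observability: wrap each stage so that
# duration, output size, and failures are logged for proactive detection.
import functools
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("pipeline")

def observed(stage_name):
    """Decorator that logs duration and output size for a pipeline stage."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            start = time.perf_counter()
            try:
                result = fn(*args, **kwargs)
            except Exception:
                log.exception("stage %s failed", stage_name)
                raise
            elapsed = time.perf_counter() - start
            count = len(result) if hasattr(result, "__len__") else "n/a"
            log.info("stage %s: %s records in %.3fs", stage_name, count, elapsed)
            return result
        return inner
    return wrap

@observed("ingest_claims")  # hypothetical stage name
def ingest_claims():
    """Stand-in for a real ingestion step (e.g. a Data Factory copy)."""
    return [{"claim_id": i} for i in range(3)]
```

In a production setting the log lines would feed a monitoring sink (for instance, Azure Monitor) so that missing or shrunken batches raise alerts before reports go stale.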