Data Engineer
Job in Englewood, Arapahoe County, Colorado, 80151, USA
Listed on 2026-02-07
Listing for: Residex®
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Data Analyst
Job Description
Key Responsibilities
- Design and build data products in Snowflake that serve as the foundation for analytics, dashboards, and operational insights—treating data models as intentional products with clear interfaces, documentation, and performance SLAs that enable downstream teams to build with confidence
- Architect scalable ETL/ELT pipelines extracting data from production SQL Server databases and transforming it into analytics-ready data products in Snowflake, ensuring reliability, accuracy, and usability across thousands of healthcare communities
- Collaborate deeply with the Data UI/UX Developer and domain experts to understand how data will be consumed and design data models that anticipate analytics needs, reduce friction, and enable self-service exploration rather than just meeting immediate requirements
- Translate business logic embedded in legacy application code and stored procedures into maintainable, well‑documented data layer transformations using modern tools such as dbt, ensuring business rules are accurate, auditable, and positioned as reusable data products
- Build dimensional data models in Snowflake including star and snowflake schemas that balance query performance, analytical flexibility, and maintainability—designing data structures that empower rather than constrain downstream analytics development
- Champion data product thinking by establishing clear data contracts, semantic definitions, and quality guarantees that give BI developers, analysts, and business users confidence in the data they're building on
- Implement DevOps best practices for data pipelines, including version control (Git), CI/CD automation, infrastructure as code, monitoring, and alerting, to ensure data products are deployed reliably and evolve safely as requirements change
- Establish and maintain data quality frameworks including validation rules, reconciliation processes, and automated testing to ensure analytics products meet healthcare industry accuracy standards and data consumers can trust the foundation they're building on
- Document data lineage, transformation logic, and business rules using tools like Dataedo or equivalent data catalog platforms, creating living documentation that helps downstream teams understand what data means, where it comes from, and how to use it effectively
- Work closely with the Data Architect to implement warehouse architecture decisions including schema design, indexing strategies, partitioning, and query optimization that support sub‑second dashboard response times and enable scalable self‑service analytics
- Optimize pipeline performance and cost efficiency in Snowflake through query tuning, materialized views, clustering, and efficient data loading patterns while maintaining the usability and accessibility of data products
- Engage with data consumers (Data UI/UX Developer, analysts, domain experts) to gather feedback on data product usability, identify pain points in data access or structure, and continuously evolve data models to better serve their workflows
- Support data governance initiatives including implementing access controls, audit logging, and HIPAA‑compliant data handling practices for protected health information (PHI) while ensuring appropriate data discoverability and access for authorized users
Qualifications
- 5+ years of experience building production data pipelines and data products at scale, with demonstrated ability to design data models that serve downstream analytics and reporting needs effectively
- Strong data product mindset: you understand that data engineering isn't just about moving data; it's about creating reliable, well‑documented, usable data assets that enable others to build analytics products that improve how people work
- Deep SQL development skills including complex queries, window functions, CTEs, stored procedures, and performance optimization for both SQL Server (source) and Snowflake (target)
- Hands‑on experience with Snowflake architecture including warehouses, databases, schemas, stages, streams, tasks, and understanding of Snowflake‑specific optimization techniques
- Strong proficiency with ETL/ELT tools, with dbt strongly preferred for transformation logic,…