Data Platform Engineer
Listed on 2026-02-16
IT/Tech
Data Engineer, Data Analyst
Location: Hybrid in Denver, CO (2 days/week on-site)
Pay Rate: $50-60/hour (flexible depending on experience and qualifications)
Type: 6-Month Contract (with potential for extension)
Our client is building a modern business intelligence platform designed to help member organizations analyze their individual performance while also providing a consolidated, network-wide view of data. They are partnering with an experienced data team and a consulting group to deliver a scalable platform built on a modern data stack, including Snowflake, dbt, Dagster, and Power BI.
As part of this initiative, our client is seeking a Software Engineer with a strong data engineering focus to help design, build, and operate the ingestion layer of the platform. This role is centered on developing reliable, scalable, Python-based data pipelines that load data into Snowflake through APIs, database replication, and web automation/scraping, with a strong emphasis on data quality, observability, and long-term maintainability.
What You’ll Do
- Design, build, and own Python-based data ingestion pipelines into Snowflake using orchestration tools such as Airflow, Dagster, or Prefect
- Ingest data from APIs and databases (replication/CDC), as well as via Selenium-based web scraping and automation
- Act as the ingestion subject‑matter expert, owning pipeline reliability, performance, and edge‑case handling
- Implement data quality checks, monitoring, and alerting to ensure trusted, reliable data
- Develop internal data applications using Streamlit in Snowflake
- Partner with analytics, BI, and business stakeholders to support evolving data needs
- Contribute to dbt models as needed and help enforce data engineering best practices
- Create and maintain clear, well‑structured technical documentation
- Monitor cloud infrastructure and Snowflake usage, contributing to cost management and optimization
What You’ll Bring
- Bachelor’s degree in Computer Science, Engineering, or a related discipline
- Proven experience building Python-based data pipelines using Airflow, Dagster, or Prefect
- Hands‑on experience ingesting data via APIs, database replication/CDC, and web scraping or automation
- Strong SQL skills and experience working with Snowflake or comparable cloud data warehouses
- Familiarity with containerization and deployment workflows (Docker, Kubernetes, CI/CD)
- Solid understanding of data modeling, ELT/ETL patterns, and modern data engineering practices
- Strong communication skills and the ability to manage multiple initiatives concurrently