
Data Platform Engineer

Job in Denver, Denver County, Colorado, 80285, USA
Listing for: Interactive Resources - iR
Part Time, Contract position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range or Industry Benchmark: $50 - $60 USD hourly
Job Description & How to Apply Below

Location: Hybrid in Denver, CO (2 days/week on-site)

Pay Rate: $50-60/hour (flexible depending on experience and qualifications)

Type: 6-Month Contract (with potential for extension)

Our client is building a modern business intelligence platform designed to help member organizations analyze their individual performance while also providing a consolidated, network-wide view of data. They are partnering with an experienced data team and a consulting group to deliver a scalable platform built on a modern data stack, including Snowflake, dbt, Dagster, and Power BI.

As part of this initiative, our client is seeking a Software Engineer with a strong data engineering focus to help design, build, and operate the ingestion layer of the platform. This role is centered on developing reliable, scalable, Python-based data pipelines that load data into Snowflake through APIs, database replication, and web automation/scraping, with a strong emphasis on data quality, observability, and long-term maintainability.
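To make the ingestion work concrete, the sketch below shows one plausible shape for an API-to-Snowflake pipeline step of the kind described above. This is a minimal illustration under assumed details, not the client's actual code: the endpoint URL, environment variables, warehouse, database, and table names are all hypothetical placeholders.

```python
# Minimal sketch: pull records from a hypothetical REST API and land them in a
# Snowflake table. Endpoint, credentials, and object names are placeholders.
import os

import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas


def ingest_members_api() -> None:
    # Fetch a page of records from the example API (hypothetical endpoint).
    resp = requests.get(
        "https://api.example.com/v1/members",
        headers={"Authorization": f"Bearer {os.environ['API_TOKEN']}"},
        timeout=30,
    )
    resp.raise_for_status()
    df = pd.DataFrame(resp.json()["results"])  # assumed response shape

    # Load into a raw landing table; downstream dbt models handle transformation.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="INGEST_WH",   # placeholder warehouse
        database="RAW",          # placeholder database
        schema="MEMBERS",        # placeholder schema
    )
    try:
        write_pandas(conn, df, table_name="MEMBERS_API", auto_create_table=True)
    finally:
        conn.close()


if __name__ == "__main__":
    ingest_members_api()
```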

What You'll Do
  • Design, build, and own Python-based data ingestion pipelines into Snowflake using orchestration tools such as Airflow, Dagster, or Prefect (see the Dagster sketch after this list)
  • Ingest data from APIs, databases (replication/CDC), and Selenium-based web scraping and automation
  • Act as the ingestion subject‑matter expert, owning pipeline reliability, performance, and edge‑case handling
  • Implement data quality checks, monitoring, and alerting to ensure trusted, reliable data
  • Develop internal data applications using Streamlit in Snowflake (see the Streamlit sketch after this list)
  • Partner with analytics, BI, and business stakeholders to support evolving data needs
  • Contribute to dbt models as needed and help enforce data engineering best practices
  • Create and maintain clear, well‑structured technical documentation
  • Monitor cloud infrastructure and Snowflake usage, contributing to cost management and optimization
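As a rough illustration of the orchestration and data-quality bullets above, here is a minimal Dagster sketch: one asset standing in for an ingestion step, plus an asset check that gates on row count. The asset name, sample data, and check threshold are invented for the example.

```python
# Minimal Dagster sketch: an ingestion asset plus a data-quality check.
import dagster as dg
import pandas as pd


@dg.asset
def members_raw() -> pd.DataFrame:
    """Placeholder ingestion asset; in practice this would pull from an API,
    a CDC/replication stream, or a Selenium scrape and land rows in Snowflake."""
    return pd.DataFrame(
        {"member_id": [1, 2, 3], "status": ["active", "active", "lapsed"]}
    )


@dg.asset_check(asset=members_raw)
def members_raw_not_empty(members_raw: pd.DataFrame) -> dg.AssetCheckResult:
    """Simple quality gate: fail the check if the extract came back empty."""
    return dg.AssetCheckResult(
        passed=bool(len(members_raw) > 0),
        metadata={"row_count": len(members_raw)},
    )


defs = dg.Definitions(assets=[members_raw], asset_checks=[members_raw_not_empty])
```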
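For the internal-applications bullet, a Streamlit in Snowflake page might look roughly like this; the query and table name are placeholders carried over from the hypothetical ingestion sketch above.

```python
# Minimal Streamlit-in-Snowflake sketch: query a placeholder landing table
# using the session Snowflake provides to the app, then render the result.
import streamlit as st
from snowflake.snowpark.context import get_active_session

session = get_active_session()  # available inside Streamlit in Snowflake

st.title("Member ingestion overview")

# RAW.MEMBERS.MEMBERS_API is a hypothetical landing table.
df = session.sql(
    "select status, count(*) as members "
    "from RAW.MEMBERS.MEMBERS_API group by status"
).to_pandas()

st.dataframe(df)
st.bar_chart(df.set_index("STATUS"))  # Snowflake uppercases unquoted names
```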
What They’re Looking For
  • Bachelor’s degree in Computer Science, Engineering, or a related discipline
  • Proven experience building Python-based data pipelines using Airflow, Dagster, or Prefect
  • Hands‑on experience ingesting data via APIs, database replication/CDC, and web scraping or automation
  • Strong SQL skills and experience working with Snowflake or comparable cloud data warehouses
  • Familiarity with containerization and deployment workflows (Docker, Kubernetes, CI/CD)
  • Solid understanding of data modeling, ELT/ETL patterns, and modern data engineering practices
  • Strong communication skills and the ability to manage multiple initiatives concurrently