
Software Engineer, Data Engineer, Data Analyst

Job in Denver, Denver County, Colorado, 80285, USA
Listing for: Montegallo.
Full Time position
Listed on 2026-02-14
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Security, Data Science Manager
Salary/Wage Range: 80,000 – 100,000 USD per year
Job Description & How to Apply Below

Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment Visa at this time.

We are building a modern business intelligence platform that enables member organizations to analyze their performance while also providing a consolidated view across the network. Our experienced data team is partnering with a consulting group to deliver a scalable platform built on a modern data stack, including Snowflake, dbt, Dagster, and Power BI.

We are seeking a Software Engineer with a strong data engineering focus to design, build, and operate the ingestion layer of this platform. This role centers on developing reliable, scalable Python-based data pipelines that load data into Snowflake via APIs, database replication, and web automation/scraping, with a strong emphasis on quality, observability, and maintainability.

What You’ll Do
  • Design, build, and own Python-based data ingestion pipelines into Snowflake using Airflow, Dagster, or Prefect
  • Ingest data from APIs, databases (replication/CDC), and Selenium-based web scraping
  • Serve as the ingestion domain expert, owning pipeline reliability, performance, and edge cases
  • Implement data quality checks, monitoring, and alerting to ensure trusted data
  • Develop internal data applications using Streamlit in Snowflake
  • Collaborate with analytics, BI, and business stakeholders to support evolving data needs
  • Contribute to dbt models as needed and help enforce engineering best practices
  • Create and maintain clear technical documentation
  • Monitor cloud and Snowflake usage and help manage platform costs
What We’re Looking For
  • Bachelor’s degree in Computer Science, Engineering, or a related field
  • Demonstrated experience building Python-based data pipelines using Airflow, Dagster, or Prefect
  • Hands‑on experience ingesting data via APIs, database replication/CDC, and web scraping/automation
  • Strong SQL skills and experience with Snowflake or similar cloud data warehouses
  • Familiarity with containerization (Docker, Kubernetes) and CI/CD workflows
  • Understanding of data modeling, ELT/ETL principles, and modern data engineering practices
  • Strong communication skills and ability to manage multiple initiatives