
DataOps Engineer

Job in Denver, Denver County, Colorado, 80285, USA
Listing for: Empower
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer
Salary/Wage Range: USD 80,000 - 100,000 per year
Job Description

Our vision for the future is based on the idea that transforming financial lives starts by giving our people the freedom to transform their own. We have a flexible work environment and fluid career paths. We not only encourage but celebrate internal mobility. We also recognize the importance of purpose, well-being, and work-life balance. Within Empower and our communities, we work hard to create a welcoming and inclusive environment, and our associates dedicate thousands of hours to volunteering for causes that matter most to them.

Chart your own path and grow your career while helping more customers achieve financial freedom. Empower Yourself.

*** Applicants must be authorized to work for any employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time, including CPT/OPT.***

The DataOps Engineer will own the DataOps lifecycle for our Snowflake-on-AWS platform: from contract-first design and CI/CD, to observability, quality, release management, and incident response. You’ll turn data products into reliable services with SLAs/SLOs (freshness, accuracy, completeness, timeliness), automate promotion across environments, and hard-wire governance (PII tagging, masking, RBAC) so trusted data ships fast and safely.

What you will do:

Data product lifecycle & contracts
  • Define and enforce data contracts (schemas, SLAs/SLOs, versioning, deprecation) for batch/streaming products; guard against breaking changes.
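
For illustration, a breaking-change guard can be a CI query that diffs a product's live schema against its registered contract; a minimal Snowflake SQL sketch, where all object names (ops.data_contracts, the analytics database, the ORDERS table) are hypothetical:

    -- Compare contracted columns against the live schema; any returned row is
    -- a violation (missing column or drifted type) that should block promotion.
    -- All names here are illustrative.
    SELECT c.column_name,
           c.expected_type,
           l.data_type AS live_type
    FROM   ops.data_contracts c
    LEFT JOIN analytics.information_schema.columns l
           ON l.table_schema = c.table_schema
          AND l.table_name   = c.table_name
          AND l.column_name  = c.column_name
    WHERE  c.table_name = 'ORDERS'
      AND (l.column_name IS NULL OR l.data_type <> c.expected_type);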

CI/CD & environment management
  • Maintain a schema registry/contract repo and promotion workflow (dev → test → prod) with automated checks and approvals.

  • Standardize environment topologies, seed/test data, and release calendars to reduce lead time and change failure rate.
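
One common way to seed realistic test data in Snowflake is zero-copy cloning, which stands up a test environment from production without duplicating storage; a hedged sketch with hypothetical database names:

    -- Clone prod into a test database; storage is shared copy-on-write.
    CREATE OR REPLACE DATABASE analytics_test CLONE analytics_prod;

    -- Or pin the clone to a known-good point in time via Time Travel:
    CREATE OR REPLACE DATABASE analytics_test
      CLONE analytics_prod AT (OFFSET => -3600);  -- state one hour ago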

Orchestration & reliability
  • Engineer idempotent pipelines using Streams/Tasks, Snowpipe/Kafka, and orchestration (Airflow/Dagster/Step Functions/Lambda); see the sketch after this list.

  • Publish runbooks and SLOs for datasets/jobs (freshness, latency, failure rate); run capacity planning and game days.
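
As a sketch of the idempotent Streams/Tasks pattern above (all object names hypothetical): a task wakes on a schedule, runs only when its stream has new rows, and applies them with MERGE so reruns are safe.

    -- Capture changes on the raw table.
    CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

    -- Consume the stream with MERGE; a rerun cannot double-apply rows because
    -- the stream offset advances only when the DML commits.
    CREATE OR REPLACE TASK load_orders
      WAREHOUSE = etl_wh
      SCHEDULE  = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
    AS
      MERGE INTO curated.orders t
      USING raw.orders_stream s
         ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.updated_at = s.updated_at
      WHEN NOT MATCHED THEN INSERT (order_id, amount, updated_at)
                            VALUES (s.order_id, s.amount, s.updated_at);

    ALTER TASK load_orders RESUME;  -- tasks are created suspended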

Data quality & observability
  • Implement the data test pyramid: column/row checks, anomaly detection, reconciliation, and end-to-end validation.

  • Build monitoring/alerts from ACCOUNT/ORG usage views and pipeline metadata (QUERY/LOAD/ACCESS history); wire alerts to on-call with clear ownership and auto-ticketing.
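
For example, a freshness SLI can be computed straight from load metadata; a minimal sketch against Snowflake's ACCOUNT_USAGE views (the 60-minute SLO is hypothetical, and note these views lag real time by up to a couple of hours):

    -- Flag tables whose last successful load breaches a 60-minute freshness SLO.
    SELECT table_name,
           MAX(last_load_time) AS last_load
    FROM   snowflake.account_usage.copy_history
    WHERE  status = 'Loaded'
    GROUP BY table_name
    HAVING DATEDIFF('minute', MAX(last_load_time), CURRENT_TIMESTAMP()) > 60;

A scheduled check like this can page on-call whenever rows come back.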

Governance-by-design (with DG & Security)
  • Automate PII classification and object tags; enforce tag-based masking, row access policies, RBAC role families, and network policies (see the sketch below).

  • Ensure lineage and glossary links (Collibra/OpenLineage) are updated on every release; produce audit evidence on demand.
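
A minimal tag-based masking sketch (schema, tag, policy, and role names all hypothetical): classify the column once, and masking follows automatically wherever the tag lands.

    -- PII classification tag and a string-masking policy bound to it.
    CREATE TAG IF NOT EXISTS governance.pii_type;

    CREATE MASKING POLICY IF NOT EXISTS governance.mask_pii_string
      AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
           ELSE '***MASKED***' END;

    ALTER TAG governance.pii_type SET MASKING POLICY governance.mask_pii_string;

    -- Pipelines only need to tag the column; the policy applies via the tag.
    ALTER TABLE crm.customers MODIFY COLUMN email
      SET TAG governance.pii_type = 'EMAIL';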

Incident & change management
  • Lead data incident triage (bad/missing/late data), customer comms, RCAs, and post-incident hardening.

  • Operate change control with impact analysis, blast-radius limits, and progressive delivery (canary/backfills).
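
During triage, Snowflake Time Travel gives a quick blast-radius view and a recovery path; a hedged sketch (table name and incident timestamp hypothetical):

    -- Snapshot the pre-incident state of a damaged table.
    CREATE TABLE curated.orders_pre_incident CLONE curated.orders
      AT (TIMESTAMP => '2026-02-16 09:00:00'::TIMESTAMP_LTZ);

    -- Rows that changed since the incident window opened.
    SELECT * FROM curated.orders
    EXCEPT
    SELECT * FROM curated.orders_pre_incident;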

FinOps & usage analytics
  • Track queries, warehouse utilization, and job cost; implement guardrails (rightsizing, auto-suspend hygiene, query tagging/chargeback); see the sketch after this list.

  • Recommend workload placement (Snowflake vs. adjacent engines) balancing SLA, quality, and cost.
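
As a sketch of query tagging and cost attribution (the tag value and lookback window are hypothetical):

    -- Tag a session's queries so spend can be charged back by team/job.
    ALTER SESSION SET QUERY_TAG = 'team:data-ops;job:nightly_load';

    -- Attribute the last 30 days of credits by warehouse.
    SELECT warehouse_name,
           SUM(credits_used) AS credits_30d
    FROM   snowflake.account_usage.warehouse_metering_history
    WHERE  start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_30d DESC;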

Tools you’ll likely use
  • Snowflake
  • dbt
  • Terraform (Snowflake provider)
  • GitHub/GitLab/Azure DevOps
  • Airflow/Dagster/Step Functions/Lambda
  • Python/Bash
  • Kafka/Kinesis/Snowpipe
  • ACCOUNT/ORG Usage views
  • Collibra/OpenLineage
  • Tableau/Power BI/Qlik
  • CloudWatch/Datadog/Splunk
What you will bring:
  • Education: Bachelor’s in Computer Science, Information Systems, Data/Analytics, or related; equivalent practical experience welcomed.

  • Experience: 5–8+ years in data engineering/analytics platform roles with 3+ years operating Snowflake in production.

  • DataOps skills: You’ve shipped contract-first pipelines, automated tests, and environment promotion at scale; you measure success with SLIs/SLOs and error budgets.

  • Snowflake depth: Warehouses, Streams/Tasks, Snowpipe/Kafka Connector, search optimization, materialized views, replication/failover; strong SQL and performance tuning.

  • Automation: Terraform (Snowflake provider), dbt (models/tests/docs), GitHub/GitLab/Azure DevOps; Python/Bash for tooling and checks.

  • Observability: Building alerts/dashboards from ACCOUNT/ORG usage views; experience with data…
