DataOps Engineer
Listed on 2026-02-16
IT/Tech
Data Engineer
Our vision for the future is based on the idea that transforming financial lives starts by giving our people the freedom to transform their own. We have a flexible work environment, and fluid career paths. We not only encourage but celebrate internal mobility. We also recognize the importance of purpose, well-being, and work-life balance. Within Empower and our communities, we work hard to create a welcoming and inclusive environment, and our associates dedicate thousands of hours to volunteering for causes that matter most to them.
Chart your own path and grow your career while helping more customers achieve financial freedom. Empower Yourself.
*** Applicants must be authorized to work for any employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time, including CPT/OPT.***
The DataOps Engineer will own the DataOps lifecycle for our Snowflake-on-AWS platform: from contract-first design and CI/CD, to observability, quality, release management, and incident response. You’ll turn data products into reliable services with SLAs/SLOs (freshness, accuracy, completeness, timeliness), automate promotion across environments, and hard-wire governance (PII tagging, masking, RBAC) so trusted data ships fast and safely.
What you will do:
Data product lifecycle & contracts
Define and enforce data contracts (schemas, SLAs/SLOs, versioning, deprecation) for batch/streaming products; guard against breaking changes.
Maintain a schema registry/contract repo and promotion workflow (dev → test → prod) with automated checks and approvals.
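For a sense of what such an automated check might look like, here is a minimal sketch of a CI gate that compares a proposed schema against the registered contract and fails on breaking changes; the contract format, column names, and types are illustrative, not a prescribed standard:

```python
# Illustrative CI gate: fail the promotion if the proposed schema drops or
# retypes a column the registered contract guarantees. Additive columns
# pass, matching typical backward-compatibility rules.
from typing import Dict, List

def breaking_changes(contract: Dict[str, str], proposed: Dict[str, str]) -> List[str]:
    """Return human-readable violations of the contract."""
    issues = []
    for col, dtype in contract.items():
        if col not in proposed:
            issues.append(f"column dropped: {col}")
        elif proposed[col] != dtype:
            issues.append(f"type changed: {col} {dtype} -> {proposed[col]}")
    return issues

if __name__ == "__main__":
    # Hypothetical contract and proposed revision for an orders product.
    contract = {"order_id": "NUMBER", "amount": "NUMBER", "placed_at": "TIMESTAMP_NTZ"}
    proposed = {"order_id": "NUMBER", "amount": "VARCHAR", "channel": "VARCHAR"}
    violations = breaking_changes(contract, proposed)
    if violations:
        raise SystemExit("contract check failed:\n" + "\n".join(violations))
```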
Standardize environment topologies, seed/test data, and release calendars to reduce lead time and change failure rate.
Engineer idempotent pipelines using Streams/Tasks, Snowpipe/Kafka, and orchestration (Airflow/Dagster/Step Functions/Lambda).
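As a rough sketch of the Streams/Tasks half of that pattern (the object names raw.orders, curated.orders, and etl_wh are invented, and an authenticated snowflake-connector-python cursor is assumed), a task can MERGE stream contents so replays never double-apply rows:

```python
# Sketch: create a stream on a raw table plus a task that MERGEs its contents.
# Keying the MERGE on order_id keeps the load idempotent across retries.
from snowflake.connector.cursor import SnowflakeCursor

DDL = [
    "CREATE STREAM IF NOT EXISTS raw_orders_stream ON TABLE raw.orders",
    """
    CREATE TASK IF NOT EXISTS merge_orders
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      MERGE INTO curated.orders t
      USING raw_orders_stream s ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.amount = s.amount
      WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount)
    """,
    "ALTER TASK merge_orders RESUME",  # tasks are created suspended
]

def deploy(cursor: SnowflakeCursor) -> None:
    for stmt in DDL:
        cursor.execute(stmt)
```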
Publish runbooks and SLOs for datasets/jobs (freshness, latency, failure rate); run capacity planning and game days.
Implement the data test pyramid: column/row checks, anomaly detection, reconciliation, and end-to-end validation.
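A minimal sketch of the bottom of that pyramid, assuming a live database cursor (the table names and the 1% null threshold are placeholders):

```python
# Base-of-pyramid checks: a column null-rate gate and a source/target
# row-count reconciliation. Thresholds and names are illustrative.
def check_null_rate(cursor, table: str, column: str, max_rate: float = 0.01) -> None:
    cursor.execute(
        f"SELECT COUNT_IF({column} IS NULL) / NULLIF(COUNT(*), 0) FROM {table}"
    )
    rate = cursor.fetchone()[0] or 0.0
    assert rate <= max_rate, f"{table}.{column} null rate {rate:.2%} > {max_rate:.2%}"

def check_reconciliation(cursor, source: str, target: str) -> None:
    cursor.execute(
        f"SELECT (SELECT COUNT(*) FROM {source}) - (SELECT COUNT(*) FROM {target})"
    )
    drift = cursor.fetchone()[0]
    assert drift == 0, f"row-count drift of {drift} between {source} and {target}"
```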
Build monitoring/alerts from ACCOUNT/ORG usage views and pipeline metadata (QUERY/LOAD/ACCESS history); wire alerts to on-call with clear ownership and auto-ticketing.
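For example (a sketch only; send_alert is a stand-in for whatever paging/ticketing hook the team uses), recent failures can be pulled straight from QUERY_HISTORY:

```python
# Sketch: route failed queries from ACCOUNT_USAGE to on-call. Note that
# ACCOUNT_USAGE views can lag real time by up to ~45 minutes, so tighter
# loops may prefer INFORMATION_SCHEMA table functions instead.
FAILED_SQL = """
SELECT query_id, query_tag, error_code, error_message, start_time
FROM snowflake.account_usage.query_history
WHERE execution_status = 'FAIL'
  AND start_time >= DATEADD('hour', -1, CURRENT_TIMESTAMP())
ORDER BY start_time DESC
"""

def page_on_failures(cursor, send_alert) -> int:
    cursor.execute(FAILED_SQL)
    failures = cursor.fetchall()
    for query_id, tag, code, msg, started in failures:
        send_alert(f"[{tag or 'untagged'}] query {query_id} failed ({code}): {msg}")
    return len(failures)
```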
Automate PII classification and object tagging; enforce tag-based masking, row access policies, RBAC role families, and network policies.
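A compressed sketch of that tag-based masking wiring (the governance schema, role name, and table are hypothetical; in practice this would live in Terraform's Snowflake provider rather than ad-hoc SQL):

```python
# Sketch: create a PII tag, bind a masking policy to the tag, and classify a
# column. Any column tagged pii_level is then masked for roles without
# PII_READER in session; all object names here are invented.
STATEMENTS = [
    "CREATE TAG IF NOT EXISTS governance.tags.pii_level",
    """
    CREATE MASKING POLICY IF NOT EXISTS governance.tags.mask_string AS
      (val STRING) RETURNS STRING ->
      CASE WHEN IS_ROLE_IN_SESSION('PII_READER') THEN val ELSE '***MASKED***' END
    """,
    "ALTER TAG governance.tags.pii_level SET MASKING POLICY governance.tags.mask_string",
    "ALTER TABLE curated.customers MODIFY COLUMN email"
    " SET TAG governance.tags.pii_level = 'high'",
]

def apply_governance(cursor) -> None:
    for stmt in STATEMENTS:
        cursor.execute(stmt)
```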
Ensure lineage and glossary links (Collibra/OpenLineage) are updated on every release; produce audit evidence on demand.
Lead data incident triage (bad/missing/late data), customer comms, RCAs, and post-incident hardening.
Operate change control with impact analysis, blast-radius limits, and progressive delivery (canary/backfills).
Track queries, warehouse utilization, and job cost; implement guardrails (rightsizing, auto-suspend hygiene, query tagging/chargeback).
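One concrete guardrail of that kind, sketched below with placeholder thresholds: join seven-day credit burn against SHOW WAREHOUSES output and flag warehouses whose auto-suspend is off or set above ten minutes:

```python
# Sketch: auto-suspend hygiene check. Flags warehouses that burned credits in
# the last 7 days while auto-suspend is disabled or above max_suspend_secs.
CREDITS_SQL = """
SELECT warehouse_name, SUM(credits_used)
FROM snowflake.account_usage.warehouse_metering_history
WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
GROUP BY warehouse_name
"""

def flag_suspend_hygiene(cursor, max_suspend_secs: int = 600):
    cursor.execute(CREDITS_SQL)
    credits = dict(cursor.fetchall())
    cursor.execute("SHOW WAREHOUSES")
    cols = [c[0].lower() for c in cursor.description]
    flagged = []
    for row in cursor.fetchall():
        wh = dict(zip(cols, row))
        suspend = wh.get("auto_suspend") or 0  # 0/None means never suspends
        if credits.get(wh["name"], 0) > 0 and (suspend == 0 or int(suspend) > max_suspend_secs):
            flagged.append((wh["name"], suspend, credits[wh["name"]]))
    return flagged
```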
Recommend workload placement (Snowflake vs. adjacent engines) balancing SLA, quality, and cost.
Tech stack:
- Snowflake
- dbt
- Terraform (Snowflake provider)
- GitHub/GitLab/Azure DevOps
- Airflow/Dagster/Step Functions/Lambda
- Python/Bash
- Kafka/Kinesis/Snowpipe
- ACCOUNT/ORG Usage views
- Collibra/OpenLineage
- Tableau/Power BI/Qlik
- CloudWatch/Datadog/Splunk
Education: Bachelor’s in Computer Science, Information Systems, Data/Analytics, or related; equivalent practical experience welcomed.
Experience: 5–8+ years in data engineering/analytics platform roles with 3+ years operating Snowflake in production.
DataOps skills: You’ve shipped contract-first pipelines, automated tests, and environment promotion at scale; you measure success with SLIs/SLOs and error budgets (see the sketch after this list).
Snowflake depth: Warehouses, Streams/Tasks, Snowpipe/Kafka Connector, search optimization, materialized views, replication/failover; strong SQL and performance tuning.
Automation: Terraform (Snowflake provider), dbt (models/tests/docs), GitHub/GitLab/Azure DevOps; Python/Bash for tooling and checks.
Observability: Building alerts/dashboards from ACCOUNT/ORG usage views; experience with data…
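As a toy illustration of the error-budget arithmetic mentioned above (the 99.5% target and 30-day window are examples, not stated requirements):

```python
# Toy math: a 99.5% freshness SLO over a 30-day window leaves 3.6 hours of
# error budget; burn is the fraction of that budget already consumed.
WINDOW_HOURS = 30 * 24
SLO = 0.995
budget_hours = WINDOW_HOURS * (1 - SLO)  # 3.6 hours

def budget_burn(stale_hours: float) -> float:
    """Fraction of the freshness error budget consumed this window."""
    return stale_hours / budget_hours

print(f"budget: {budget_hours:.1f}h; 1.8h stale = {budget_burn(1.8):.0%} burned")
```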