Senior Data Engineer
Listed on 2026-02-14
IT/Tech
Data Engineer, Cloud Computing, Data Analyst
About Torus
Torus is headquartered in Utah and is expanding manufacturing at Giga One, our 540,000-square-foot facility in Salt Lake City. Our mission is to build the world’s first mesh energy infrastructure, designed to unite people and communities through resilient, secure, and intelligent power. We design, engineer, manufacture, install, and support our systems end-to-end, standing behind them throughout their lifecycle. Torus systems help reduce costs, lower emissions, and protect facilities from outages, while strengthening the security and reliability of the broader utility grid.
Torus is committed to American manufacturing, engineering excellence, and building energy systems that last. At Torus, you will be part of something larger than a single product or technology. Your work will help build energy infrastructure that supports critical systems, industry, and communities for decades to come. We value accountability, collaboration, and clear thinking. We are looking for people who want to solve hard problems and build things that matter.
The Role
As a Senior Data Engineer, you will build and scale the data infrastructure that powers Torus’s mission. As a core member of our data team, you’ll build and maintain our modern data stack (dbt, Redshift, Airflow, Fivetran, Metabase, and Streamlit), designing and implementing scalable data pipelines that ingest, transform, and serve data from our complex ecosystem of IoT devices, grid systems, and business applications.
You’ll collaborate on and support the infrastructure that enables our data team to operate efficiently, build real‑time data pipelines, create robust ELT workflows that ensure data quality and reliability, and develop tools that make data accessible to both technical and non‑technical stakeholders. Your work directly enables machine learning models, analytics dashboards, and business intelligence that drive strategic decisions.
Who You Are
- Autonomous and ownership‑oriented: thrive with autonomy and end-to-end ownership of your domain; exercise strong technical judgment across the data stack.
- Collaborative architect: collaborate on architectural design and contribute to technical direction; effective both independently and as part of a team; share knowledge and provide candid feedback.
- Builder mindset: write elegant, maintainable code; pick up new technologies efficiently; build infrastructure and tools that empower data scientists, analysts, and business users.
- Quality‑focused: detail‑oriented with strong data intuition and passion for data quality and reliability.
- Adaptable startup operator: thrive in a startup environment with ambiguous requirements and rapidly changing priorities; energized by scaling data systems and shaping how the company uses data.
- Continuous learner: excited about learning new technologies and tackling unfamiliar problems; propose innovative solutions.
- Mission‑driven: passionate about using technology to combat climate change and transform how people consume energy.
What You’ll Do
- Design, build, and maintain scalable batch and streaming data pipelines that handle high‑volume IoT telemetry and business data.
- Develop robust ELT workflows to ingest, transform, and load data from diverse sources including APIs, databases, IoT devices, and third‑party systems.
- Build and optimize our data warehouse using Redshift, implementing dimensional models that support analytics and machine learning use cases.
- Implement real‑time data processing systems that enable immediate insights and rapid response to system events.
- Develop incremental SQL patterns in dbt for efficient data transformation.
- Build tools, processes, and pipelines to enforce, check, and manage data quality at scale.
- Develop monitoring and alerting systems to ensure pipeline reliability and data freshness.
- Create data validation frameworks and automated testing for data pipelines.
- Establish best practices for data governance, documentation, and lineage tracking.
- Build frameworks that enable data scientists to deploy models to production efficiently.
- Develop self‑service analytics…