Data Platform Engineer, Enterprise Data
Listed on 2025-12-28
IT/Tech
Data Engineer
Radiant is an El Segundo, CA-based startup building the world’s first mass-produced, portable nuclear microreactors. The company’s first reactor, Kaleidos, is a 1-megawatt failsafe microreactor that can be transported anywhere power is needed and run for up to five years without the need to refuel. Portable nuclear power with rapid deploy capability can replace similar-sized diesel generators, and provide critical asset support for hospitals, data centers, remote sites, and military bases.
Radiant’s unique, practical approach to nuclear development utilizes modern software engineering to rapidly achieve safe, factory-built microreactors that leverage existing, well-qualified materials. Founded in 2020, Radiant is on track to test its first reactor next year at the Idaho National Laboratory, with initial customer deliveries beginning in 2028.
We're building a new team to own Radiant's internal data infrastructure, executive analytics, and AI capabilities. As a Staff Data Platform Engineer, you'll be hands-on building the integrations and pipelines that connect Radiant's operational systems—Finance, HR, Supply Chain, Manufacturing, and Recruiting—into a unified data platform.
Your first priority will be the ION‑Ramp integration: building the pipeline that enables three-way matching (PO + Receipt + Invoice) between our manufacturing execution system and payments platform. This directly reduces manual work for the Finance team. From there, you'll expand to connect 9 enterprise systems, build dbt transformation models, and ensure data quality across the platform.
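To make "three-way matching" concrete, here is a minimal, purely illustrative sketch in Python; the record shapes, field names, and tolerance are assumptions for illustration and do not reflect the actual ION or Ramp data models:

```python
# Illustrative only: record shapes, field names, and the tolerance are
# hypothetical assumptions, not the actual ION or Ramp data model.
from dataclasses import dataclass
from typing import Optional


@dataclass
class PurchaseOrder:
    po_number: str
    amount: float


@dataclass
class Receipt:
    po_number: str
    quantity_received: int


@dataclass
class Invoice:
    po_number: str
    amount: float


def three_way_match(po: PurchaseOrder,
                    receipt: Optional[Receipt],
                    invoice: Invoice,
                    tolerance: float = 0.01) -> bool:
    """Approve an invoice only when a matching PO exists, goods were
    actually received, and the invoiced amount agrees with the PO
    within a small tolerance."""
    if invoice.po_number != po.po_number:
        return False  # invoice does not reference this PO
    if receipt is None or receipt.quantity_received <= 0:
        return False  # nothing received yet: hold the invoice
    return abs(invoice.amount - po.amount) <= tolerance * po.amount
```

In production this logic would live in the Snowflake → Databricks → Ramp pipeline described below; the core rule (pay an invoice only when the PO, receipt, and invoice agree) stays this simple.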
What You’ll Do
- Build integrations: Configure Fivetran connectors, build custom pipelines, and connect 9 enterprise systems (ION, Ramp, QuickBooks, Rippling, Ashby, Salesforce, SharePoint, Smartsheet, DocuSign)
- Develop the ION‑Ramp pipeline: Build the Snowflake → Databricks → Ramp pipeline that enables three-way matching for Finance
- Write dbt models: Transform raw data into clean, documented, tested models for dashboards and analytics
- Ensure data quality: Implement testing, monitoring, and alerting for data pipelines; catch issues before they reach dashboards (a brief sketch of this kind of check follows this list)
- Support dashboards: Work with the PM and stakeholders to ensure the data models support Finance, Operations, and HR/Recruiting dashboards
- Document everything: Maintain clear documentation for pipelines, models, and data definitions
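As a rough illustration of the "Ensure data quality" responsibility above, here is a hedged sketch of a pipeline check; the table, column names, and sample data are hypothetical, not Radiant's actual schema:

```python
# Hypothetical check on an invoices extract; columns, rules, and the
# alerting behavior are assumptions for illustration only.
import pandas as pd


def check_invoices(df: pd.DataFrame) -> list:
    """Return a list of human-readable data-quality failures."""
    failures = []
    if df["invoice_id"].duplicated().any():
        failures.append("duplicate invoice_id values")
    if (df["amount"] < 0).any():
        failures.append("negative invoice amounts")
    if df["po_number"].isna().any():
        failures.append("invoices missing a PO reference")
    return failures


if __name__ == "__main__":
    sample = pd.DataFrame({
        "invoice_id": [1, 2, 2],
        "amount": [100.0, -5.0, 30.0],
        "po_number": ["PO-1", None, "PO-3"],
    })
    # In a real pipeline, a non-empty result would block the load or
    # trigger an alert before the data reaches any dashboard.
    print(check_invoices(sample))
```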
What You’ll Need
- 5+ years of data engineering or software engineering experience
- Python and SQL proficiency—you’ll write both daily
- Experience with data warehouses—Databricks, Snowflake, BigQuery, or Redshift
- ETL/ELT pipeline development—Fivetran, Airbyte, Stitch, or custom pipeline experience
- API integration experience—REST, GraphQL; you’re comfortable reading API docs and building integrations with enterprise SaaS tools (a brief sketch of this kind of work appears after this list)
- Data modeling skills—dimensional modeling, schema design, understanding of how data will be used downstream
- Background in hardware or hardtech companies—you understand manufacturing, supply chain, and physical product development
- Experience with dbt (data build tool)
- Experience with data quality frameworks and testing (Great Expectations, dbt tests, etc.)
- Comfortable building lightweight internal tools—Python/Flask, basic frontend—or using AI coding assistants (Cursor, Copilot) to extend your capabilities
- Familiarity with MES, ERP, or supply chain systems (ION, SAP, NetSuite, etc.)
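For the API-integration work mentioned above, here is a hedged sketch of cursor-based pagination against a generic REST endpoint; the URL path, auth scheme, and response fields are placeholders rather than any listed vendor's real API:

```python
# Placeholder endpoint, auth scheme, and response fields; this does not
# describe the real API of any vendor named in this posting.
import requests


def fetch_all_records(base_url: str, token: str) -> list:
    """Page through a cursor-based REST API and return every record."""
    records, cursor = [], None
    while True:
        params = {"limit": 100}
        if cursor:
            params["cursor"] = cursor
        resp = requests.get(
            f"{base_url}/records",
            headers={"Authorization": f"Bearer {token}"},
            params=params,
            timeout=30,
        )
        resp.raise_for_status()  # surface HTTP errors instead of loading bad data
        payload = resp.json()
        records.extend(payload.get("data", []))
        cursor = payload.get("next_cursor")
        if not cursor:
            return records
```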
Why Radiant
- Mission: Clean energy that can go anywhere—few problems matter more
- Team: Work alongside exceptional people from SpaceX, Blue Origin, and other top engineering organizations
- Compensation: Competitive compensation with equity
- Benefits: Health/dental/vision, 401(k), and flexible PTO
- Impact: The pipelines you build directly reduce manual work for Finance and give leadership real‑time visibility
- Variety: You’ll work across 9 different systems—Finance, HR, Manufacturing, CRM, Documents—never boring
- Modern stack: Databricks, Fivetran, dbt, Airbyte—best-in‑class tools
This position requires the ability to work in the United States and eligibility for access to export‑controlled information under ITAR/EAR.