
Senior Data Engineer

Job in Salem, Marion County, Oregon, 97308, USA
Listing for: Agility Robotics
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager, Data Analyst, Robotics
Salary/Wage Range: 120,000 - 150,000 USD per year
Job Description & How to Apply Below

Salem, OR or San Francisco Bay Area or Pittsburgh, PA

Agility Robotics is a pioneer. Our robot, Digit, is the first to be sold into workplaces across the globe. Our team is differentiated by its expertise in imagining, engineering, and delivering robots with advanced mobility, dexterity, intelligence, and efficiency -- robots specifically designed to work alongside people, in spaces built for people. Every day, we break through engineering challenges and invent new solutions and capabilities that will one day make robots commonplace and approachable.

This work is our passion and our responsibility: our mission is to make businesses more productive and people’s lives more fulfilling.

About The Team

Agility Robotics is building the future of work through humanoid robots that operate in human environments. The Data Platform team builds the data infrastructure that powers everything from fleet operations and hardware reliability to business analytics and machine learning. We enable engineers across robotics, perception, and product teams to derive insight from the vast quantities of telemetry and log data generated by our robots in the field.

About The Role

We are looking for a Senior Data Engineer to join our Data Platform team and help shape the foundation of our data-driven operations. In this role, you'll work closely with robot software and hardware teams (among others) to design, curate, and maintain high-quality datasets that enable analytics, debugging, and fleet-scale insights.

You'll bridge the gap between raw robot data and actionable information, working on both on-robot data generation and cloud ingestion and processing pipelines. You'll design transformations, author pipelines, and collaborate across teams to deliver reliable, queryable data products for hardware reliability, system health, workflow metrics, and root cause analysis.

What You’ll Do
  • Collaborate with robot software and hardware teams to define, collect, and curate data needed for analytics and debugging.
  • Develop and maintain ETL pipelines that transform raw robot logs and telemetry into structured datasets using Spark, Airflow (or equivalent orchestration tools), and AWS data services.
  • Contribute to on-robot data production workflows to ensure high-fidelity, well-structured data capture.
  • Design derived datasets and transformations across Avro, Parquet, and other sensor data formats to power fleet operations, reliability analysis, and business metrics.
  • Implement data quality checks, schema evolution, and metadata management practices using our internal Data Registry and cataloging systems.
  • Work closely with the ingestion and storage services that move robot data into the cloud (S3-based data lake).
  • Collaborate with internal consumers — reliability, analytics, and ML teams — to design efficient data models for their workflows.
  • Occasionally contribute to shared libraries or APIs in Python, Java, or C++ to support data capture and consumption.
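To make the pipeline work above concrete, here is a minimal, illustrative ETL step: parsing raw JSON-lines telemetry into structured rows and deriving a per-robot reliability metric. The log format, field names, and metric are hypothetical examples, not Agility's actual schema; production pipelines of this kind would typically run in Spark under Airflow orchestration rather than plain Python.

```python
import json
from datetime import datetime, timezone

# Hypothetical raw telemetry: one JSON object per line, roughly how a
# robot log stream might emit it. All field names are illustrative only.
RAW_LOG = """\
{"robot_id": "digit-042", "ts": 1700000000.0, "joint": "left_knee", "torque_nm": 41.5}
{"robot_id": "digit-042", "ts": 1700000001.0, "joint": "left_knee", "torque_nm": 44.0}
{"robot_id": "digit-007", "ts": 1700000000.5, "joint": "right_hip", "torque_nm": 38.2}
"""

def parse_record(line: str) -> dict:
    """Transform one raw log line into a structured, typed row."""
    raw = json.loads(line)
    return {
        "robot_id": raw["robot_id"],
        # Normalize the epoch-seconds timestamp to an ISO-8601 UTC string.
        "event_time": datetime.fromtimestamp(raw["ts"], tz=timezone.utc).isoformat(),
        "joint": raw["joint"],
        "torque_nm": float(raw["torque_nm"]),
    }

def peak_torque_by_robot(lines) -> dict:
    """Derive a per-robot metric (peak joint torque) from structured rows."""
    peaks: dict = {}
    for row in map(parse_record, lines):
        peaks[row["robot_id"]] = max(peaks.get(row["robot_id"], 0.0), row["torque_nm"])
    return peaks

print(peak_torque_by_robot(RAW_LOG.splitlines()))
```

The same shape (parse, normalize, aggregate into a queryable dataset) is what the Spark/Airflow stack named above scales out across fleet-sized log volumes.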
What We’re Looking For

Required:
  • 5+ years of experience as a Data Engineer or similar role building and maintaining production data pipelines.
  • Strong proficiency in Apache Spark or equivalent distributed data processing frameworks.
  • Experience with Airflow, Dagster, Prefect, or other data orchestration systems.
  • Proficiency with data formats such as Avro, Parquet, and structured/numeric datasets.
  • Solid understanding of data modeling, schema evolution, and data quality best practices.
  • Good intuition for how to model datasets logically and partition them physically for optimal query performance, both for analytical query engines and for playback or root-cause-analysis tools (e.g., Rerun, Foxglove).
  • Strong programming skills in Python, Java and/or Scala.
  • Experience with AWS data stack (S3, Glue, Athena, EMR, etc.) or similar cloud infrastructure.
  • Experience working with vision data pipelines (e.g., images, video, depth) and building derived datasets from them.
  • Comfort working cross-functionally with software, hardware, and analytics teams in a fast-paced environment.
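As one concrete illustration of the logical-vs-physical modeling point in the list above, a common pattern supported by Spark, Glue, and Athena alike is Hive-style partitioning, where partition key values become directory names so query engines can prune irrelevant files. The bucket, prefix, and key names in this sketch are hypothetical:

```python
from datetime import datetime, timezone

def partition_path(prefix: str, robot_id: str, ts: float) -> str:
    """Compute a Hive-style partition path for one telemetry record.

    Partitioning by date (and secondarily by robot) lets engines such
    as Athena or Spark scan only the partitions a query touches instead
    of the whole data lake. Keys and layout here are illustrative.
    """
    day = datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d")
    return f"{prefix}/date={day}/robot_id={robot_id}/"

print(partition_path("s3://lake/telemetry", "digit-042", 1700000000.0))
```

Putting the date key first matches the dominant access pattern for fleet telemetry (time-bounded queries), which is generally what "partition them physically for optimal query performance" comes down to in practice.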
Nice to Have:
  • Experience with robotics vision data (RGB, depth, point clouds, or perception outputs) and how to process, store, and query them efficiently.
  • Fa…
Position Requirements
10+ Years work experience