
Test Data Analyst Seattle

Job in Seattle, King County, Washington, 98127, USA
Listing for: Overland AI Inc
Full Time position
Listed on 2025-12-20
Job specializations:
  • IT/Tech
    Data Analyst, Cybersecurity, Data Security, IT Support
Job Description & How to Apply Below
Position: Test Data Analyst (Seattle)

Founded in 2022 and headquartered in Seattle, Washington, Overland AI is transforming land operations for modern defense. The company leverages over a decade of advanced research in robotics and machine learning, as well as a field-test-forward ethos, to deliver combined capabilities for unit commanders. Our OverDrive autonomy stack enables ground vehicles to navigate and operate off-road in any terrain without GPS or direct operator control.

Our intuitive OverWatch C2 interface provides commanders with the precise coordination capabilities essential for mission success.

Overland AI has secured funding from prominent defense tech investors including 8VC and Point72, and has built trusted partnerships with DARPA, the U.S. Army, the Marine Corps, and Special Operations Command. Backed by eight-figure contracts across the Department of Defense, we are strengthening national security by iterating closely with end users engaged in tactical operations.

Role Summary

Overland AI is hiring a Test Data Analyst to develop and maintain a first‑order, data‑driven understanding of how our autonomous vehicles behave in real‑world testing. This role sits within the Systems, Safety, and Test (SST) organization and partners closely with software, hardware, and test teams to turn daily field test outputs into reliable insight that improves autonomy performance, safety, and system maturity.

This role is centered on deep, hands-on analysis of field test data. You will spend your time immersed in autonomy runs, synchronized logs, ROS MCAPs, sensor outputs, and recorded test video, building deep intuition for system behavior by repeatedly reviewing the same routes and scenarios over time. This sustained exposure and consistent analysis enables you not only to annotate and tag data but also to identify subtle patterns, regressions, and improvements that are not visible through metrics alone.

You will be embedded in the test workflow, translating observed behavior into structured datasets, high‑quality issue reports, and clear test summaries. Your work forms the factual record of system behavior that engineering, leadership, and customers rely on to assess readiness and risk in demanding defense environments.

This role sits at the intersection of autonomy testing, data analysis, and systems thinking, with a strong emphasis on accuracy, traceability, and clarity over speed.

Key Responsibilities

Primary Responsibility: Field Test Data Review & Behavior Analysis
  • Perform deep review of autonomy field test data, including synchronized video, ROS MCAPs, telemetry, and sensor outputs
  • Build strong familiarity with system behavior by analyzing repeated routes and scenarios across changing software and hardware configurations
  • Annotate autonomy behavior, anomalies, and decision‑making moments with precise timestamps and contextual notes
  • Identify subtle deviations, trends, and regressions that emerge through longitudinal analysis rather than single test runs
  • Identify, classify, and document hardware, software, and system‑level behaviors observed during autonomy testing
  • Own the quality of issue reporting by producing, reviewing, and enriching bug reports with clear context, timestamps, and supporting evidence
  • Track and trend system behavior across repeated routes, environments, and software/hardware releases to identify regressions and improvements
  • Analyze recurring anomalies (e.g., odometry stability, localization consistency, planner decisions) using longitudinal test data
  • Perform structured analysis to identify contributing factors across autonomy software, vehicle systems, sensing, and operations
  • Support issue prioritization by providing data‑backed context that distinguishes isolated events from systemic risk
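The longitudinal analysis described above — comparing the same routes across software releases to separate isolated events from systemic regressions — can be sketched in a few lines. This is an illustrative example only, not Overland AI's actual tooling; the metric ("interventions per km"), the route and release names, and the 1.5x threshold are all hypothetical:

```python
from statistics import mean
from collections import defaultdict

# Hypothetical per-run records: (route, software_release, interventions_per_km).
# Lower is better for this illustrative metric.
runs = [
    ("route_a", "v1.0", 0.8), ("route_a", "v1.0", 0.9),
    ("route_a", "v1.1", 1.6), ("route_a", "v1.1", 1.7),
    ("route_b", "v1.0", 0.5), ("route_b", "v1.1", 0.4),
]

def flag_regressions(runs, threshold=1.5):
    """Flag routes where a release's mean metric worsened by more than
    `threshold`x relative to the previous release on the same route."""
    by_route_release = defaultdict(list)
    for route, release, value in runs:
        by_route_release[(route, release)].append(value)

    flagged = []
    for route in sorted({r for r, _, _ in runs}):
        releases = sorted({rel for rt, rel, _ in runs if rt == route})
        for prev, curr in zip(releases, releases[1:]):
            prev_mean = mean(by_route_release[(route, prev)])
            curr_mean = mean(by_route_release[(route, curr)])
            if curr_mean > threshold * prev_mean:
                flagged.append((route, prev, curr, round(curr_mean / prev_mean, 2)))
    return flagged

print(flag_regressions(runs))
# route_a worsened from a mean of 0.85 to 1.65 (1.94x), so it is flagged;
# route_b improved, so it is not.
```

The point of the sketch is the grouping structure: aggregating by (route, release) pairs is what lets repeated exposure to the same route turn noisy single-run data into a defensible trend.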
Test Reporting & Evidence Development
  • Generate clear, structured test summaries that synthesize large volumes of data into conclusions and recommendations
  • Contribute traceable evidence to support hazard analysis, validation activities, and future certification efforts
  • Help define repeatable standards and formats for test reporting as the organization scales
Data Visibility & Communication
  • Transform raw test data and analysis into visual, consumable…