
Senior Engineer, Software - Perception; R3771

Job in Boston, Suffolk County, Massachusetts, 02298, USA
Listing for: The Rundown AI, Inc.
Full Time position
Listed on 2025-12-23
Job specializations:
  • Engineering
    AI Engineer, Systems Engineer, Robotics, Aerospace / Aviation / Avionics
Salary/Wage Range or Industry Benchmark: 200,000 - 250,000 USD per year
Job Description & How to Apply Below
Position: Senior Engineer, Software - Perception (R3771)

Founded in 2015, Shield AI is a venture-backed defense technology company with the mission of protecting service members and civilians with intelligent systems. Its products include the V-BAT aircraft, Hivemind Enterprise, and the Hivemind Vision product lines. With offices in San Diego, Dallas, Washington, D.C., Boston, Abu Dhabi (UAE), Kyiv (Ukraine), and Melbourne (Australia), Shield AI’s technology actively supports U.S. and allied operations worldwide.

For more information, visit (Use the "Apply for this Job" box below). Follow Shield AI on LinkedIn, X, YouTube, and Instagram.

This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments. A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.

Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.

What you'll do:
  • Develop advanced perception algorithms — Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.
  • Implement sensor fusion frameworks — Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.
  • Develop state estimation capabilities — Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation (a minimal illustrative sketch of this kind of filter follows this list).
  • Analyze and utilize sensor ICDs — Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.
  • Optimize perception performance — Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.
  • Support autonomy integration — Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.
  • Validate in simulated and operational settings — Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.
  • Collaborate with hardware and sensor teams — Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.
  • Drive innovation in airborne sensing — Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.
  • Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).
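
As a minimal illustration of the state estimation work described above, the sketch below implements a 1-D constant-velocity Kalman filter in Python/NumPy that fuses one noisy position sensor per step. It is an assumption-laden example for orientation only, not Shield AI code; the motion model, noise values, and names are all hypothetical.

```python
# Minimal 1-D constant-velocity Kalman filter (illustrative sketch only).
# State x = [position, velocity]; a single noisy position measurement
# (e.g. a GPS-like fix) is fused each step. All parameters are made up.
import numpy as np

dt = 0.1                                   # time step [s]
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity motion model
H = np.array([[1.0, 0.0]])                 # we only measure position
Q = np.diag([1e-4, 1e-2])                  # process noise covariance
R = np.array([[0.25]])                     # measurement noise covariance

x = np.array([[0.0], [1.0]])               # initial state estimate
P = np.eye(2)                              # initial state covariance

def step(x, P, z):
    """One predict/update cycle given a scalar position measurement z."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    y = np.array([[z]]) - H @ x_pred       # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

rng = np.random.default_rng(0)
for k in range(50):
    true_pos = 1.0 * k * dt                # target moving at 1 m/s
    z = true_pos + rng.normal(0.0, 0.5)    # noisy position measurement
    x, P = step(x, P, z)

print("estimated position/velocity:", x.ravel())
```
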
Required Qualifications:
  • BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, or a similar degree, or equivalent practical experience
  • Typically requires a minimum of 5 years of related experience with a Bachelor’s degree; or 4 years with a Master’s degree; or 2 years with a PhD; or equivalent work experience.
  • Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models.
  • Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches (see the fusion sketch after this list).
  • Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications.
  • Ability to interpret and work with Interface Control…
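
As a small illustration of the probabilistic fusion mentioned in the qualifications, the sketch below combines two hypothetical 2-D position estimates, one radar-like and one camera-like, by inverse-covariance weighting. The sensor characteristics and numbers are assumptions for the example, not taken from the posting.

```python
# Illustrative-only sketch: fuse two independent Gaussian estimates of the
# same 2-D position (say, radar-derived and EO/IR-derived) in information
# form, weighting each by its inverse covariance. All values are hypothetical.
import numpy as np

def fuse(mean_a, cov_a, mean_b, cov_b):
    """Fuse two Gaussian estimates of the same quantity."""
    info_a, info_b = np.linalg.inv(cov_a), np.linalg.inv(cov_b)
    cov_fused = np.linalg.inv(info_a + info_b)
    mean_fused = cov_fused @ (info_a @ mean_a + info_b @ mean_b)
    return mean_fused, cov_fused

radar_pos = np.array([102.0, 48.5])        # [m] range-accurate, cross-range noisy
radar_cov = np.diag([1.0, 25.0])
camera_pos = np.array([99.0, 50.2])        # [m] bearing-accurate, range-poor
camera_cov = np.diag([36.0, 1.0])

pos, cov = fuse(radar_pos, radar_cov, camera_pos, camera_cov)
print("fused position:", pos)
print("fused covariance diagonal:", np.diag(cov))
```
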
Position Requirements
10+ years work experience