
Perception Engineer

Job in Ogden, Weber County, Utah, 84403, USA
Listing for: Autonomous Solutions
Full Time position
Listed on 2026-02-12
Job specializations:
  • Engineering
    Robotics, Systems Engineer
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 USD per year
Job Description

About Us

At Autonomous Solutions, Inc. (ASI), we are a global leader in vehicle automation. Our technology enables safe, efficient, and scalable automation for industries including solar, agriculture, construction, and landscaping. With a commitment to innovation and excellence, ASI continues to push the boundaries of what is possible in autonomous technology. Our mission is to help you reach your potential through innovative robotic solutions.

We pride ourselves on our core values:
Safe, Simple, Transparent, Growth, Humble, and Attention to Detail.

About the Role

We are hiring Perception Engineers across multiple levels (I-V) to join our autonomy team. Perception Engineers at ASI are responsible for designing, implementing, and deploying real-time perception systems that help unmanned ground vehicles (UGVs) interpret their surroundings and make intelligent decisions. The role includes sensor evaluation, sensor fusion, tracking, object detection, and deploying code to production systems.

This position requires proficiency in C++11 or newer, strong sensor data processing skills (especially LiDAR), and the ability to develop robust, real-time algorithms. Perception Engineers collaborate with teams across ASI, including Embedded Software, Planning, Controls, and Systems Engineering, and work on platforms that operate in diverse and challenging environments. The role currently involves roughly 50% field deployment work, with the goal of shifting toward 90% development work over time.
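For context on the kind of sensor data processing the role involves, the following is a minimal, illustrative C++11 sketch (hypothetical, not ASI production code) of a common LiDAR pre-processing step: voxel-grid downsampling of a raw point cloud before it feeds fusion or tracking stages. The Point struct, voxelDownsample function, and 0.1 m leaf size are assumptions for illustration only.

// Hypothetical illustration -- not ASI code. Voxel-grid downsampling of a
// LiDAR point cloud: keep one centroid per cubic voxel of side `leaf` metres.
#include <cmath>
#include <cstdint>
#include <iostream>
#include <unordered_map>
#include <vector>

struct Point { float x, y, z; };

// Pack integer voxel coordinates (21 bits each) into a single 64-bit key.
static std::uint64_t voxelKey(int ix, int iy, int iz) {
    auto enc = [](int v) {
        return static_cast<std::uint64_t>(static_cast<std::uint32_t>(v)) & 0x1FFFFF;
    };
    return (enc(ix) << 42) | (enc(iy) << 21) | enc(iz);
}

std::vector<Point> voxelDownsample(const std::vector<Point>& cloud, float leaf) {
    struct Acc { float x = 0, y = 0, z = 0; int n = 0; };  // running sum per voxel
    std::unordered_map<std::uint64_t, Acc> grid;
    grid.reserve(cloud.size());
    for (const Point& p : cloud) {
        const int ix = static_cast<int>(std::floor(p.x / leaf));
        const int iy = static_cast<int>(std::floor(p.y / leaf));
        const int iz = static_cast<int>(std::floor(p.z / leaf));
        Acc& a = grid[voxelKey(ix, iy, iz)];
        a.x += p.x; a.y += p.y; a.z += p.z; ++a.n;
    }
    std::vector<Point> out;
    out.reserve(grid.size());
    for (const auto& kv : grid) {
        const Acc& a = kv.second;
        out.push_back(Point{a.x / a.n, a.y / a.n, a.z / a.n});  // voxel centroid
    }
    return out;
}

int main() {
    std::vector<Point> cloud = {{0.01f, 0.02f, 0.0f}, {0.03f, 0.01f, 0.0f}, {1.2f, 0.9f, 0.4f}};
    std::vector<Point> reduced = voxelDownsample(cloud, 0.1f);  // 0.1 m voxels
    std::cout << "Reduced " << cloud.size() << " points to " << reduced.size() << "\n";
    return 0;
}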

This is a hybrid position that requires three days (Tuesday through Thursday) on-site.

Job Duties
  • Develop and deploy real-time perception algorithms using LiDAR, radar, cameras, and ultrasonic sensors.
  • Design and implement classical perception systems including sensor fusion, object tracking, and feature extraction.
  • Contribute to machine learning-based perception pipelines as appropriate to project needs.
  • Write clean, efficient C++ code optimized for embedded Linux environments.
  • Integrate perception software with existing robotic platforms using ROS2 or custom middleware.
  • Support field deployment and troubleshooting of perception systems (currently ~50% field work).
  • Collaborate with architecture, planning, and control teams to ensure consistent interface design.
  • Participate in simulation, HIL testing, and field validation to ensure system robustness.
  • Analyze system performance and improve perception robustness in GPS-denied and adverse conditions.
  • Work with the architecture team to provide feedback on module standards and interfaces.
Level Breakdown
Perception Engineer I
  • 0-2 years of experience.
  • Bachelor's degree in Computer Science, Robotics, Electrical Engineering, or related field.
  • Familiarity with sensor data processing and basic C++ programming.
  • Works under guidance to integrate and test perception systems.
Perception Engineer II
  • 2-4 years of experience.
  • Bachelor's or Master's degree.
  • Solid skills in C++11, sensor processing (e.g., LiDAR), and algorithm implementation.
  • Contributes to both classical and ML-based perception tasks with some independence.
Perception Engineer III
  • 4-6 years of experience.
  • Master's degree preferred.
  • Leads development of specific perception features or modules.
  • Works across multiple projects and contributes to deployment architecture.
Perception Engineer IV
  • 6+ years of experience.
  • Master's degree required.
  • Responsible for complex perception challenges including fusion across heterogeneous sensors.
  • Provides mentorship and guidance to junior engineers.
Perception Engineer V
  • 8+ years of experience.
  • Recognized expert in perception for autonomous systems.
  • Sets technical direction for perception architecture and strategy.
  • Leads high-impact projects with multiple stakeholders.
Requirements
Required
  • Bachelor's degree required for Level I; Master's preferred or required for Levels II-V.
  • Proficiency in C++11 or newer.
  • Experience processing data from LiDAR, radar, or camera systems.
  • Familiarity with embedded Linux development.
  • Strong understanding of linear algebra and mathematical modeling.
  • Ability to contribute to deployment-ready, high-confidence field systems.
Preferred
  • Experience with ROS2, GPU processing, and embedded ML applications.
  • Background…