Autonomy Engineer, Computer Vision
Listed on 2025-12-10
Engineering
Robotics, AI Engineer, Computer Science
Autonomy Engineer, Computer Vision – BRINC Drones
About BRINC
At BRINC, we are redefining public safety with an innovative ecosystem of life-saving tools. Our journey started with the development of drones and ruggedized throw phones, designed to access unsafe areas and establish communication to de-escalate situations. Today, we’ve expanded into creating and deploying 911 response networks, where drones are dispatched to 911 calls to provide real-time visual data, enhancing safety and enabling de-escalation-focused responses.
Our cutting-edge solutions are used by over 600 public safety agencies across America, and the company has raised over $150M from investors including Index Ventures, Motorola Solutions, Sam Altman, Dylan Field, Mike Volpe, Alexandr Wang, and more. We are committed to recruiting the world’s best talent to support first responders in saving lives, and we are currently seeking skilled engineers to develop flight-critical autonomy software, with a focus on advanced drone pilot assistance features.
About this Role
We are seeking a Computer Vision Engineer to join our Autonomy engineering team to advance the visual perception and vision‑based autonomy capabilities of our UAVs and public safety products. You will design, implement, and optimize real‑time computer vision algorithms that enable robust localization, mapping, and visual navigation in challenging environments. In this role, you will work across VIO, VSLAM, depth and reconstruction pipelines, and visual scene understanding, collaborating closely with software, autonomy, controls, and hardware teams to bring vision algorithms into production UAV systems.
Key Responsibilities
- Research, design, and implement vision-based localization and mapping algorithms, including VIO and VSLAM.
- Develop real‑time computer vision pipelines for tracking, depth estimation, stereo/mono reconstruction, and dense/semi‑dense mapping.
- Architect and optimize vision‑centric sensor fusion systems combining cameras, IMUs, LiDAR, radar, and other sensors for robustness in diverse environments.
- Build perception algorithms enabling vision‑based navigation, including feature tracking, obstacle detection, and perception‑driven flight behaviors.
- Develop computer vision and machine learning models for scene understanding, object detection, and dynamic obstacle identification.
- Implement and optimize CV pipelines on embedded GPU or accelerator platforms, focusing on high performance and low latency.
- Validate perception and autonomy performance through simulation, hardware‑in‑the‑loop, and real‑world flight testing.
- Collaborate with cross‑functional teams to ensure seamless integration with autonomy, controls, mechanical, and firmware systems.
- Contribute to technical strategy, establish best practices, and help guide the evolution of the perception stack as the system scales and matures.
Qualifications
- Bachelor’s, Master’s, or PhD in Computer Science, Robotics, Electrical Engineering, or a related field, with 3+ years of industry experience. Note: we are considering candidates for mid-career, senior, and principal positions.
- Strong programming skills in C++ and Python, with experience building real‑time systems.
- Experience developing computer vision or perception systems for robotics or UAVs, with a foundation in VSLAM, VIO, and/or related topics. Proficiency with standard frameworks and modern computer vision techniques.
- Familiarity with implementing CV models or pipelines on embedded systems, GPUs, or hardware accelerators.
- Hands‑on experience with robotics or UAV testing, including data collection, system debugging, and field validation.
- Deep knowledge of sensor fusion and tightly coupled vision–IMU systems.
- Experience with machine learning–based perception, including training and optimizing deep models for edge hardware.
- Background in vision‑based navigation, visual servoing, or perception‑driven autonomy.
- Strong understanding of real‑time systems, GPU optimization, or high‑performance computer vision.
- Familiarity with UAV safety,…