
Perception Engineer - Sensor Fusion

Company:
Knightscope Inc.
Location:
Sunnyvale, CA, 94085
Posted:
March 28, 2026

Description:

About Knightscope

Knightscope is a security technology company building the Nation's First Autonomous Security Force.

The Company combines autonomous machines, advanced software, and human expertise to help protect people, property, and critical infrastructure.

Knightscope's long-term mission is to make the United States of America the safest country in the world.

Location: Knightscope HQ, Sunnyvale, CA (this position is not remote)

Job Summary:

Knightscope is seeking a Perception Engineer with a strong AI/ML background to develop and improve the perception stack for our autonomous robotic platforms.

This role focuses on multi-sensor data fusion across cameras, LiDAR, radar, and ultrasonic sensors to enable robust environmental understanding and safe robot operation.

You will work closely with controls, simulation, and test engineers to deliver perception capabilities that directly impact robot safety, reliability, and performance in real-world deployments.

About the Role

In this role, you will design, develop, and deploy perception algorithms that enable Knightscope robots to understand and navigate complex environments.

You will contribute across the perception stack, including 3D object localization, motion estimation, and tracking, while improving the quality of inputs consumed by downstream autonomy systems.

This role combines AI/ML model development, multi-sensor fusion, and real-world robotics integration, with a strong focus on performance, robustness, and deployment at scale.

Key Responsibilities

* Design and develop multi-sensor perception algorithms using data from cameras, LiDAR, radar, and ultrasonic sensors
* Build and improve sensor fusion pipelines for object detection, classification, tracking, and scene understanding
* Develop algorithms for 3D object localization, motion estimation, and tracking in dynamic environments
* Implement and optimize perception systems for real-time performance and reliability on robotic platforms
* Work with large-scale datasets to train, evaluate, and deploy AI/ML models for perception tasks
* Leverage fleet-scale data to evaluate perception performance, identify edge cases, and improve algorithms for robustness, accuracy, and real-world reliability
* Define and track perception performance metrics and continuously improve system performance through data-driven iteration
* Collaborate closely with controls, simulation, and test teams to ensure perception outputs support safe and reliable robot behavior
* Support integration, debugging, and validation of perception systems on real robotic platforms
* Contribute to tools, infrastructure, and workflows for perception development and evaluation

Required Qualifications

* M.S. or Ph.D. in Computer Science, Robotics, Electrical Engineering, Machine Learning, Computer Vision, or a related field; B.S. with strong industry experience may be considered
* Prior internship or full-time industry experience in robotics, autonomous systems, automotive, aerospace, ADAS, or related fields
* Strong background in AI/ML, computer vision, and perception systems
* Experience developing multi-sensor fusion algorithms for robotics or autonomous platforms
* Experience with 3D perception, object detection, tracking, or scene understanding
* Experience working with one or more sensing modalities: camera, LiDAR, radar, ultrasonic, or similar sensors
* Strong programming skills in Python and/or C++
* Experience developing software in large, distributed development environments
* Experience with version control systems (e.g., Git) and collaborative software development workflows (code reviews, branching, merging)
* Experience training, evaluating, and deploying machine learning models in production or near-production systems
* Ability to work in a hands-on robotics development and testing environment

Preferred Qualifications

* Experience with 3D perception methods (point clouds, voxel representations, BEV-based models, etc.)
* Familiarity with modern ML approaches such as deep learning, transformers, or temporal models for perception
* Experience with sensor calibration, synchronization, and validation
* Familiarity with ROS / ROS2
* Experience with real-time inference and embedded or edge deployment constraints
* Experience with simulation-based perception validation
* Experience supporting downstream systems such as controls, planning, or navigation
* Background in functional safety or autonomy system validation

Compensation & Benefits

* Base Salary: $150,000 to $210,000 (DOE)
* Equity: Stock options
* Benefits: Medical, dental, vision, 401(k), paid time off
* Location Requirement: Full-time, on-site at Sunnyvale HQ
