Develop and implement a set of navigation and sensor fusion algorithms (GPS, Inertial Measurement Unit (IMU) navigation systems, encoders, laser tracking sensors) for bare-metal and real-time operating systems on ARM Cortex-M architectures and for post-processing. Debug the systems in real time and in post-processing, including identifying and correcting errors, unexpected crashes, and performance regressions. Develop, implement, and debug navigation algorithms for mobile mapping systems using Global Navigation Satellite System (GNSS), laser scanner, inertial navigation system, and video camera measurements. Create and implement calibration software algorithms for different sets of sensors, including stereo cameras, laser scanners, laser tracking devices, inertial measurement units, and high-precision GNSS systems. Develop object recognition, classification, and detection algorithms based on data streams from the laser scanners and video images. Evaluate and compare different inertial navigation systems, as required, and provide the full life cycle of GNSS and IMU software development, from requirements analysis and implementation to long-term support. Evaluate different components of the vehicle-to-vehicle (V2V) software stack, working with different IEEE 802.11, IEEE 1609, and SAE J2735 protocol implementations. Perform raw data analysis for different sensors, such as GNSS, IMU, encoders, and video sensors, and integrate GNSS and IMU data to generate a trajectory, ensuring high-accuracy IMU, DMI odometry, and GNSS measurements for positional accuracy and reliability using knowledge of GNSS products and IMUs. Optimize Kalman filter parameters using knowledge of Extended Kalman Filters, and support mobile mapping systems using knowledge of programming languages (C++, Java, Python, Julia, CMake), tools, frameworks, and IDEs (Qt, Microsoft Visual Studio, Scilab, Atom), and mobile mapping software.
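The GNSS/IMU integration and Kalman filter parameter optimization duties above rest on the standard predict/update cycle. As a rough one-dimensional illustration of that idea only (not the employer's actual software, which would use a full Extended Kalman Filter over position, velocity, and attitude states), a sketch with illustrative class name, noise values, and a constant-velocity propagation model might look like:

```python
# Minimal 1-D Kalman filter fusing IMU-propagated position with GNSS
# position fixes. All names and noise values here are hypothetical
# examples, not taken from the job posting.

class Kalman1D:
    def __init__(self, x0, p0):
        self.x = x0  # position estimate (m)
        self.p = p0  # estimate variance (m^2)

    def predict(self, velocity, dt, q):
        # Propagate the state with an IMU-derived velocity over dt
        # seconds; q is the process-noise density.
        self.x += velocity * dt
        self.p += q * dt

    def update(self, z, r):
        # Fuse a GNSS position fix z with measurement variance r.
        k = self.p / (self.p + r)    # Kalman gain
        self.x += k * (z - self.x)   # corrected estimate
        self.p *= (1.0 - k)          # uncertainty shrinks after the fix
        return self.x

kf = Kalman1D(x0=0.0, p0=1.0)
kf.predict(velocity=1.0, dt=1.0, q=0.1)  # dead-reckon to ~1.0 m
est = kf.update(z=1.2, r=0.5)            # blend in the GNSS fix
```

The "parameter optimization" the posting mentions would amount to tuning quantities like `q` and `r` against reference trajectories so the filter weights IMU dead reckoning against GNSS fixes appropriately.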
Create software to perform tasks such as creating and visualizing point clouds, and to provide geo-referenced, time-stamped imagery analysis for the company's fully integrated high-density 3-D laser scanning and digital imagery mobile mapping system. Generate point clouds from LIDAR data to ensure the company's software post-processes the geo-referenced laser scanning (LIDAR) and/or digital imaging data into a viewable 3-D image representation that can then be exported to industry-standard formats. Perform systems engineering support using mobile mapping, surveying, and data collection software and equipment. Work as a team member on developing the GNSS, IMU, LIDAR, and video streaming devices and modules, focusing on Windows/Linux, communication subsystems (communication interfaces, input/output data streams), CAN stacks and drivers, and the integration of firmware parts provided by other teams. Develop, implement, and debug algorithms for navigation and control of autonomous construction machines. Modify and customize drivers for specific usage, and maintain the software build environment (multi-CPU server; create and update software build scripts for different sensor fusion and data analysis applications).
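Generating a geo-referenced point cloud, as described above, boils down to transforming each LIDAR range/bearing measurement through the scanner's time-stamped pose into a world frame. A minimal 2-D sketch of that transform (the function name and interfaces are hypothetical, and a production system would use full 3-D poses interpolated from the GNSS/IMU trajectory):

```python
import math

def lidar_to_points(pose, scans):
    """Project 2-D LIDAR measurements into a world frame.

    pose:  (x, y, heading_rad) of the scanner, e.g. from a
           geo-referenced GNSS/IMU trajectory.
    scans: list of (range_m, bearing_rad) measurements relative
           to the scanner.
    Returns a list of (x, y) world-frame points.
    """
    x0, y0, heading = pose
    points = []
    for rng, bearing in scans:
        angle = heading + bearing          # rotate into the world frame
        points.append((x0 + rng * math.cos(angle),
                       y0 + rng * math.sin(angle)))
    return points

# Scanner at the origin facing east: a 1 m return straight ahead
# lands at roughly (1, 0) in the world frame.
pts = lidar_to_points((0.0, 0.0, 0.0), [(1.0, 0.0)])
```

Each point would then carry its acquisition timestamp so the cloud can be colored from the synchronized camera imagery and exported to industry-standard formats such as LAS.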
Two (2) years of experience in the job offered, or two (2) years of experience working with Global Navigation Satellite System (GNSS) products and Inertial Measurement Unit (IMU) receivers, and a Master's degree in Robotics or Electrical Engineering.
Please copy and paste your resume into the email body; do not send attachments, as we cannot open them. Email your resume to candidates at placementservicesusa.com with reference #117753 in the subject line.
Salary not provided. Benefits include a retirement plan and health, dental, and vision insurance.