The Parallel, Real-Time Visual Simultaneous Localization and Mapping (SLAM) System is software that builds one or more maps of the environment through which a stereo camera is moved. Each map is a 3D point cloud of the environment together with the pose of the camera within it. A threaded architecture, separating differential feature tracking, relative pose estimation, and loop detection/correction, allows the system to continue exploring new areas while the map is globally corrected using a non-linear minimization. SLAM has commercial applications in GPS-denied mapping of unknown environments (search and rescue, military applications) and in motion planning for autonomous robots.
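The thread separation described above can be sketched with standard-library queues connecting the pipeline stages. This is a minimal illustration of the producer/consumer structure only; the frame representation, function names, and placeholder results are assumptions, not the system's actual implementation.

```python
import queue
import threading

# Stand-ins for camera frames (integers) flowing through the pipeline.
frames = queue.Queue()
tracked = queue.Queue()
poses = []

def feature_tracker():
    # Stage 1: track features frame-to-frame and forward the results.
    # A real tracker would run KLT here; this just tags the frame.
    while True:
        frame = frames.get()
        if frame is None:          # sentinel: no more frames
            tracked.put(None)
            break
        tracked.put(("features", frame))

def pose_estimator():
    # Stage 2: estimate relative pose from the tracked features.
    while True:
        item = tracked.get()
        if item is None:
            break
        _, frame = item
        poses.append(("pose", frame))

t1 = threading.Thread(target=feature_tracker)
t2 = threading.Thread(target=pose_estimator)
t1.start()
t2.start()
for f in range(5):
    frames.put(f)
frames.put(None)
t1.join()
t2.join()
print(len(poses))  # 5
```

Because each stage blocks only on its own queue, tracking of new frames can proceed while downstream stages (and, in the full system, loop correction) work on older data.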
• The system exploits the inherent parallelism in the visual SLAM problem by separating differential feature tracking, relative pose estimation, and loop detection/correction into separate threads of execution
• The system combines differentially tracked KLT features, which are tracked at frame rate or faster, with SIFT features, which are extracted only for key-frames. Key-frames are selected based on the optical flow computed from the KLT features
• Loop detection and correction limit drift over long sequences
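The key-frame rule, selecting a frame when the KLT features have moved far enough, can be illustrated with a toy test on feature displacements. The median statistic and the 20-pixel threshold are assumptions made for this sketch, not values from the system.

```python
import math

# Declare a key-frame when the median feature displacement since the
# last key-frame exceeds a pixel threshold (threshold is illustrative).
FLOW_THRESHOLD = 20.0  # pixels

def median_flow(prev_pts, curr_pts):
    # Magnitude of each feature's displacement between the two frames.
    mags = sorted(math.hypot(cx - px, cy - py)
                  for (px, py), (cx, cy) in zip(prev_pts, curr_pts))
    n = len(mags)
    return mags[n // 2] if n % 2 else 0.5 * (mags[n // 2 - 1] + mags[n // 2])

def is_keyframe(prev_pts, curr_pts, threshold=FLOW_THRESHOLD):
    return median_flow(prev_pts, curr_pts) >= threshold

prev = [(100.0, 100.0), (200.0, 150.0), (300.0, 120.0)]
small_move = [(102.0, 101.0), (201.0, 151.0), (301.0, 121.0)]
large_move = [(130.0, 100.0), (230.0, 150.0), (330.0, 120.0)]
print(is_keyframe(prev, small_move))  # False
print(is_keyframe(prev, large_move))  # True
```

Gating the expensive SIFT extraction on a cheap flow statistic is what lets the tracker run at frame rate while key-frame processing happens only when the view has changed appreciably.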
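The global correction by non-linear minimization can be illustrated on a toy 1-D pose chain with one loop-closure constraint, solved by gradient descent on the squared residuals. The measurements, uniform weights, gauge prior, and step size are all illustrative; the real system optimizes 6-DoF poses, not scalars.

```python
# Dead-reckoned poses drift: odometry says each step is 1.0, but a loop
# closure measures the total span as 3.6. Minimizing the squared residuals
# spreads the error over the whole chain instead of leaving it at the end.
odometry = [1.0, 1.0, 1.0, 1.0]   # measured steps between consecutive poses
loop = 3.6                        # loop closure: pose 4 is 3.6 from pose 0
x = [0.0, 1.0, 2.0, 3.0, 4.0]     # initial poses from dead reckoning

def residuals(x):
    r = [x[i + 1] - x[i] - odometry[i] for i in range(4)]
    r.append(x[4] - x[0] - loop)  # loop-closure residual
    return r

for _ in range(2000):
    r = residuals(x)
    grad = [0.0] * 5
    for i in range(4):            # odometry terms
        grad[i] -= 2 * r[i]
        grad[i + 1] += 2 * r[i]
    grad[0] -= 2 * r[4]           # loop-closure term
    grad[4] += 2 * r[4]
    grad[0] += 2 * x[0]           # gauge prior: anchor pose 0 near zero
    x = [xi - 0.05 * gi for xi, gi in zip(x, grad)]

print(round(x[4] - x[0], 2))  # 3.68
```

The corrected span lands between the odometry total (4.0) and the loop measurement (3.6), which is the drift-limiting effect the bullet describes: closing a loop pulls the whole trajectory toward consistency rather than letting error accumulate unboundedly.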