This article delves into various SLAM algorithms.
We'll describe each, highlight key features, and then wrap up with a comprehensive comparison.
Hope you enjoy it👌
Slam Toolbox
Slam Toolbox is a 2D SLAM package for ROS aimed at lifelong mapping of large spaces. It supports map serialization, multiple optimization solvers, and map merging, and has been benchmarked on areas of up to 200,000 sq. ft. It's production-ready, with high performance in both synchronous and asynchronous modes.
Slam Toolbox: https://github.com/SteveMacenski/slam_toolbox
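As a rough illustration of how it's typically brought up, here's a minimal ROS 2 Python launch sketch for the asynchronous node. The executable and parameter names follow the package's shipped examples, but treat them as assumptions and check them against your installed slam_toolbox version.

```python
# Minimal launch sketch for Slam Toolbox in online asynchronous mode.
# Executable and parameter names are assumed from the package's example
# configs; verify them against your installed slam_toolbox release.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    slam_node = Node(
        package='slam_toolbox',
        executable='async_slam_toolbox_node',   # a synchronous variant also exists
        name='slam_toolbox',
        output='screen',
        parameters=[{
            'use_sim_time': False,
            'odom_frame': 'odom',        # frame names must match your robot's TF tree
            'map_frame': 'map',
            'base_frame': 'base_link',
            'scan_topic': '/scan',
            'resolution': 0.05,          # occupancy grid resolution in meters/cell
        }],
    )
    return LaunchDescription([slam_node])
```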
Fast-LIO2
Fast-LIO2 is an efficient LiDAR-inertial odometry package that uses an iterated Kalman filter for robust navigation in noisy environments. It supports various LiDARs and IMUs, performs incremental mapping at over 100 Hz, and is well suited to UAVs and UGVs.
Fast-LIO2: https://github.com/hku-mars/FAST_LIO
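To make the "iterated Kalman filter" idea concrete, here's a generic sketch of an iterated EKF measurement update: the measurement model is relinearized around the latest estimate on every pass. The state layout, models, and tolerances below are placeholders, not Fast-LIO2's actual implementation.

```python
# Generic iterated EKF measurement update (the core trick behind FAST-LIO2's
# estimator). h is the measurement model, H_jac its Jacobian, R the noise
# covariance; all of these are placeholders supplied by the caller.
import numpy as np


def iterated_ekf_update(x_prior, P_prior, z, h, H_jac, R, n_iters=5, tol=1e-6):
    x = np.array(x_prior, dtype=float)
    for _ in range(n_iters):
        H = H_jac(x)                              # relinearize at the current iterate
        S = H @ P_prior @ H.T + R                 # innovation covariance
        K = P_prior @ H.T @ np.linalg.inv(S)      # Kalman gain
        x_new = x_prior + K @ (z - h(x) - H @ (x_prior - x))
        if np.linalg.norm(x_new - x) < tol:       # stop once the iterate settles
            x = x_new
            break
        x = x_new
    P = (np.eye(len(x)) - K @ H) @ P_prior        # posterior covariance (last linearization)
    return x, P
```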
ORB-SLAM3
ORB-SLAM3 is an accurate library for visual, visual-inertial, and multi-map SLAM with monocular, stereo, and RGB-D cameras. It handles pinhole and fisheye camera models, runs robustly in real time given sufficiently powerful hardware, and outperforms many alternatives in accuracy.
ORB-SLAM3: https://github.com/UZ-SLAMLab/ORB_SLAM3
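The building block underneath ORB-SLAM-style tracking is ORB feature extraction and matching, which can be sketched with plain OpenCV. This uses OpenCV's API, not ORB-SLAM3's own, and the image file names are placeholders.

```python
# ORB feature extraction + matching with OpenCV, as a stand-in for the
# frame-to-frame / frame-to-map matching ORB-SLAM3 performs internally.
import cv2

orb = cv2.ORB_create(nfeatures=1000)     # ORB = FAST keypoints + rotated BRIEF descriptors

# Placeholder file names: any two consecutive grayscale frames will do.
img_prev = cv2.imread('frame_prev.png', cv2.IMREAD_GRAYSCALE)
img_curr = cv2.imread('frame_curr.png', cv2.IMREAD_GRAYSCALE)

kp_prev, des_prev = orb.detectAndCompute(img_prev, None)
kp_curr, des_curr = orb.detectAndCompute(img_curr, None)

# Brute-force Hamming matching with cross-check; ORB-SLAM3 instead guides
# matching by projecting map points, but the descriptor test is the same idea.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_prev, des_curr), key=lambda m: m.distance)
print(f'{len(matches)} putative matches between the two frames')
```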
RTAB-Map
RTAB-Map's ROS2 package supports stereo, RGB-D, and 3D LiDAR for mapping and navigation. It's compatible with ROS2 Humble+, with examples for robots like TurtleBot, focusing on sensor integration.
RTAB-Map: https://github.com/introlab/rtabmap_ros (ROS2 branch)
Gmapping
Gmapping is a particle filter-based 2D SLAM algorithm in ROS, commonly used for laser-based indoor mapping. The particle filter copes well with non-linear motion and sensor models, making it straightforward for indoor navigation, but it can suffer from drift in larger spaces.
Gmapping: https://github.com/ros-perception/slam_gmapping
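Conceptually, Gmapping is a Rao-Blackwellized particle filter: every particle carries its own pose hypothesis and map, gets weighted by how well the current scan fits that map, and the set is resampled when the weights collapse. The sketch below is a generic illustration of that loop; the motion noise and the scan_likelihood function are placeholders, not slam_gmapping's models.

```python
# One step of a Rao-Blackwellized particle filter in the spirit of Gmapping.
# Each particle is a dict {'pose': np.array([x, y, theta]), 'map': ...};
# scan_likelihood(scan, pose, map) is a caller-supplied placeholder.
import numpy as np


def rbpf_step(particles, weights, odom_delta, scan, scan_likelihood, motion_noise=0.02):
    for i, p in enumerate(particles):
        # Sample a new pose from the motion model (odometry + Gaussian noise)
        p['pose'] = p['pose'] + odom_delta + np.random.normal(0.0, motion_noise, 3)
        # Weight by how well the scan matches this particle's own map
        weights[i] *= scan_likelihood(scan, p['pose'], p['map'])
        # (Gmapping would now also register the scan into this particle's map.)
    weights = weights / weights.sum()

    # Resample when the effective sample size drops too low
    n_eff = 1.0 / np.sum(weights ** 2)
    if n_eff < len(particles) / 2:
        idx = np.random.choice(len(particles), size=len(particles), p=weights)
        particles = [{'pose': particles[j]['pose'].copy(),
                      'map': particles[j]['map']} for j in idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights
```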
RESPLE
RESPLE is an open-source continuous-time odometry framework that models 6-DoF motion trajectories using B-splines in a recursive Bayesian estimator. It supports LiDAR odometry (LO), LiDAR-IMU odometry (LIO), and multi-sensor setups, emphasizing smoother and more accurate motion estimation without traditional error-state complexities. Demonstrated on datasets like HelmDyn for dynamic human motions and R-Campus for outdoor mapping, it excels in real-time performance across platforms like aerial, ground, and wearable robots.
RESPLE (LinkedIn): https://www.linkedin.com/posts/ali-pahlevani_lidar-slam-odometry-activity-7322656056151592960-XlmG
RESPLE: https://github.com/asig-x/RESPLE
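The continuous-time part is easy to picture: the trajectory is a B-spline over a set of control points, so the pose can be queried at any timestamp instead of only at sensor times. Below is a small worked example using a uniform cubic B-spline over positions; it illustrates the representation only, not RESPLE's recursive estimator, and the time-to-segment mapping is a simplification.

```python
# Query a smooth position from uniform cubic B-spline control points.
import numpy as np

# Standard uniform cubic B-spline basis matrix
M = (1.0 / 6.0) * np.array([[ 1,  4,  1, 0],
                            [-3,  0,  3, 0],
                            [ 3, -6,  3, 0],
                            [-1,  3, -3, 1]], dtype=float)


def spline_position(ctrl_pts, t, dt):
    """Position at time t, given 3D control points spaced dt seconds apart."""
    i = int(t / dt)                          # index of the active spline segment
    u = t / dt - i                           # normalized time inside that segment
    P = np.asarray(ctrl_pts[i:i + 4])        # the 4 control points influencing time t
    return np.array([1.0, u, u**2, u**3]) @ M @ P


# Example: control points along a gentle curve, queried between knots
ctrl = [np.array([x, 0.1 * x**2, 0.0]) for x in range(8)]
print(spline_position(ctrl, t=2.5, dt=1.0))
```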
KISS-SLAM
KISS-SLAM (Keep It Small and Simple) is a LiDAR-only 3D SLAM system focused on robust localization and mapping. It uses LiDAR odometry for relative motion, local map construction, and pose graph optimization to correct drift. With minimal parameter tuning, it achieves state-of-the-art pose accuracy and operates faster than sensor frame rates. It's adaptable to diverse environments, as shown in demonstrations like parking lot loops.
KISS-SLAM (LinkedIn): https://www.linkedin.com/posts/ali-pahlevani_kissabrslam-kissabricp-slam-activity-7309573385015955456-gayt
KISS-SLAM: https://github.com/PRBonn/kiss-slam
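The drift-correction step can be shown with a toy 2D pose graph: odometry edges chain keyframes together, a loop-closure edge says "we're back where we started", and a least-squares solve redistributes the accumulated error. This is only an illustration of the back-end idea using SciPy, not KISS-SLAM's own solver or parameters.

```python
# Toy SE(2) pose-graph optimization: odometry edges around a square plus one
# loop closure pull a drifted trajectory back into shape.
import numpy as np
from scipy.optimize import least_squares


def relative_pose(xi, xj):
    """Pose of node j expressed in the frame of node i; poses are (x, y, theta)."""
    c, s = np.cos(xi[2]), np.sin(xi[2])
    dx, dy = xj[0] - xi[0], xj[1] - xi[1]
    return np.array([c * dx + s * dy, -s * dx + c * dy, xj[2] - xi[2]])


def residuals(flat_poses, edges):
    poses = flat_poses.reshape(-1, 3)
    res = [poses[0]]                                      # anchor node 0 at the origin
    for i, j, z in edges:
        err = relative_pose(poses[i], poses[j]) - z
        err[2] = (err[2] + np.pi) % (2 * np.pi) - np.pi   # wrap the angle error
        res.append(err)
    return np.concatenate(res)


# Ideal square: drive 1 m, turn 90 deg, four times; the last edge is the loop closure.
edges = [(0, 1, np.array([1.0, 0.0, np.pi / 2])),
         (1, 2, np.array([1.0, 0.0, np.pi / 2])),
         (2, 3, np.array([1.0, 0.0, np.pi / 2])),
         (3, 4, np.array([1.0, 0.0, np.pi / 2])),
         (4, 0, np.array([0.0, 0.0, 0.0]))]               # loop closure: back at the start
# Drifted initial guess (what raw odometry integration might give)
init = np.array([[0.00, 0.00, 0.0],
                 [1.05, 0.02, np.pi / 2],
                 [1.03, 1.07, np.pi],
                 [-0.02, 1.10, 3 * np.pi / 2],
                 [0.08, 0.05, 2 * np.pi]])
sol = least_squares(residuals, init.ravel(), args=(edges,))
print(sol.x.reshape(-1, 3))                               # corrected keyframe poses
```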
MOLA
MOLA (Modular Optimization framework for Localization and Mapping) is a flexible SLAM system with a mix-and-match design, allowing users to assemble components like Lego blocks. It generates task-specific maps (e.g., for obstacle avoidance) and enhances LiDAR tracking for fast movements without extra tools. Compatible with ROS2, it's demonstrated with 3D-LiDAR data for accurate mapping in robotics applications.
MOLA (LinkedIn): https://www.linkedin.com/posts/ali-pahlevani_robotics-slam-ros2-activity-7303428831271387137-w5H2
MOLA: https://github.com/MOLAorg/mola
ROMAN
ROMAN (Robust Object Map Alignment Anywhere) is a view-invariant global localization method using open-set object mapping. It incorporates object geometry, shape, and semantics for robust loop closure in complex environments, especially with opposing viewpoints. As a ROS2 package, it improves multi-robot trajectory estimation and is useful for collaborative tasks.
ROMAN (LinkedIn): https://www.linkedin.com/posts/ali-pahlevani_robotics-ros2-localization-activity-7283867982580043776-EZ81
ROMAN: https://github.com/mit-acl/ROMAN
BALM
BALM (Bundle Adjustment for LiDAR Mapping) adapts computer vision's Bundle Adjustment to optimize 3D-LiDAR SLAM. It focuses on edge and plane features, using adaptive voxelization and octree structures for efficiency. Running at 10Hz for odometry and 2Hz for map refinement, it's robust for real-time point cloud processing in robotics.
BALM (LinkedIn): https://www.linkedin.com/posts/ali-pahlevani_lidar-slam-computervision-activity-7280515516056690688-bQqy
BALM: https://github.com/hku-mars/BALM
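The key observation in BALM-style LiDAR bundle adjustment is that the cost for a planar feature reduces to the smallest eigenvalue of the scatter matrix of the points falling into a voxel: if all scan poses are consistent, those points are coplanar and that eigenvalue goes to zero. Here's a minimal sketch of that cost, without BALM's adaptive voxelization or solver.

```python
# Plane-feature cost used in BALM-style LiDAR bundle adjustment: the smallest
# eigenvalue of the point scatter matrix measures out-of-plane spread.
import numpy as np


def plane_cost(points_world):
    """points_world: (N, 3) LiDAR points mapped into the world frame."""
    pts = np.asarray(points_world)
    centroid = pts.mean(axis=0)
    cov = (pts - centroid).T @ (pts - centroid) / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    normal = eigvecs[:, 0]                          # plane normal = smallest eigenvector
    return eigvals[0], normal, centroid             # eigvals[0] ~ out-of-plane variance


# Example: a noisy planar patch has a near-zero smallest eigenvalue
rng = np.random.default_rng(0)
patch = np.c_[rng.uniform(0, 1, 200), rng.uniform(0, 1, 200),
              0.001 * rng.standard_normal(200)]
print(plane_cost(patch)[0])   # close to zero when the scan poses are consistent
```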
Kinematic-ICP
Kinematic-ICP is a LiDAR odometry method incorporating kinematic constraints for wheeled robots. It balances inputs from wheel odometry, IMU, and LiDAR for smooth, precise motion in degenerate environments. Outperforming wheel-IMU setups in simulations, it's ideal for accurate localization in warehouses or outdoor spaces.
Kinematic-ICP (LinkedIn): https://www.linkedin.com/posts/ali-pahlevani_ros-ros2-gazebo-activity-7270434904444727296-FE-e
Kinematic-ICP: https://github.com/PRBonn/kinematic-icp
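A toy version of the idea: predict the motion from wheel odometry, then refine it with a scan alignment restricted to planar (x, y, yaw) motion, i.e. something a wheeled platform can physically do. The real package enforces stricter unicycle-style kinematic constraints and has its own correspondence search; map_nearest below is a placeholder.

```python
# Wheel-odometry-seeded, planar-constrained scan alignment (toy version of
# the Kinematic-ICP idea). All points are 2D; map_nearest is a placeholder
# that returns the nearest local-map point for every scan point.
import numpy as np


def se2_align(src, dst):
    """Best-fit planar rotation + translation mapping src onto dst (2D Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def kinematic_icp_step(scan_xy, map_nearest, wheel_R, wheel_t):
    """One correction step: wheel-odometry prediction refined by the LiDAR scan."""
    predicted = scan_xy @ wheel_R.T + wheel_t   # where the wheels say the points should be
    targets = map_nearest(predicted)            # data association against the local map
    dR, dt = se2_align(predicted, targets)      # planar correction on top of odometry
    return dR, dt
```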
MINS
MINS is an extended version of OpenVINS: an inertial-based estimation platform with loop closure, online calibration, and support for asynchronous sensors. Tested on datasets such as KAIST Urban with multi-sensor fusion (GNSS, wheel encoders, LiDAR, IMU, cameras), it's computationally demanding but robust in complex environments.
MINS (LinkedIn): https://www.linkedin.com/posts/ali-pahlevani_mins-vio-sensorfusion-activity-7249460309315244032-BEMh
MINS: https://github.com/rpng/MINS (extended from OpenVINS: https://github.com/rpng/open_vins)
Fast-SLAM (1.0/2.0)
Fast-SLAM uses particle filters for grid-based 2D SLAM in simulated environments. It handles static binary grids with beam sensors, providing visualizations for predefined scenes. It's educational for AI courses but limited to specific maps.
Fast-SLAM: https://github.com/yingkunwu/FastSLAM
Cartographer
Cartographer offers real-time 2D/3D SLAM across platforms and sensors. It's versatile but no longer actively maintained, with limited support for new developments.
Cartographer: https://github.com/cartographer-project/cartographer
OpenVINS
OpenVINS is a filter-based visual-inertial estimator built around an on-manifold sliding-window Kalman filter. It's robust for drones and AR/VR and supports features like online calibration, but it is VIO-focused and doesn't build full maps.
OpenVINS: https://github.com/rpng/open_vins
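The "sliding window" works by stochastic cloning: at each image time the current pose block is duplicated into the state (with the covariance augmented accordingly), and the oldest clone is marginalized out once the window is full. The sketch below shows only that bookkeeping, with a simplified 6-DoF error-state pose block; it's not OpenVINS' actual state layout or code.

```python
# Stochastic cloning bookkeeping for a sliding-window filter (MSCKF-style).
# The pose block is treated as 6 error-state dimensions; a real OpenVINS state
# also holds velocity, biases, calibration, and SLAM features.
import numpy as np

POSE_DIM = 6   # orientation (3) + position (3) in the error state


def clone_pose(x, P, pose_slice):
    """Append a copy of the current pose to the state and augment the covariance."""
    n = P.shape[0]
    J = np.vstack([np.eye(n),                   # keep the existing state as-is
                   np.eye(n)[pose_slice]])      # duplicate the pose rows
    x_aug = np.concatenate([x, x[pose_slice]])
    P_aug = J @ P @ J.T                         # clone starts perfectly correlated with the original
    return x_aug, P_aug


def marginalize_oldest_clone(x, P, start, dim=POSE_DIM):
    """Drop the oldest clone, i.e. rows/cols [start, start + dim), from the state."""
    keep = np.r_[0:start, start + dim:len(x)]
    return x[keep], P[np.ix_(keep, keep)]
```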
Comprehensive Comparison
SLAM algorithms vary in sensor reliance, computational demands, and environmental suitability.
LiDAR-based systems like RESPLE, KISS-SLAM, and Fast-LIO2 excel in structured indoor and outdoor settings and deliver high accuracy, though at higher sensor cost.
Visual systems like ORB-SLAM3 and OpenVINS are cost-effective and handle dynamic scenes, but are sensitive to lighting.
Particle filter approaches (e.g., Fast-SLAM, Gmapping) are simple and handle non-linearity well, but scale poorly with map size.
Modular systems like MOLA and Slam Toolbox offer flexibility for customization, ideal for ROS-integrated robots.
In terms of efficiency, Fast-LIO2 and KISS-SLAM run at high frequencies, suiting real-time applications, whereas MINS and ORB-SLAM3 demand powerful hardware.
Accuracy is highest in tightly-coupled systems like RESPLE and ORB-SLAM3, especially with IMU fusion, but degrades in featureless areas without multi-sensor support.
Well-established algorithms like Cartographer and Gmapping are beginner-friendly and widely used in ROS, while specialized ones like ROMAN shine in multi-robot scenarios.
Overall, the choice depends on the environment (indoor vs. outdoor), available sensors, map scale, required accuracy, and budget.