Mobile Robot Navigation with Enhanced 2D Mapping and Multi-Sensor Fusion
This paper presents an enhanced Simultaneous Localization and Mapping (SLAM) framework for mobile robot navigation. It integrates RGB-D cameras and 2D LiDAR sensors to improve both mapping accuracy and localization efficiency. We propose a data fusion strategy in which RGB-D point clouds are projected into 2D and denoised alongside the LiDAR data.
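A minimal sketch of that projection step, using only NumPy: points from the RGB-D cloud inside a height band are flattened onto the ground plane and binned into a LiDAR-like polar scan. The function name, height band, and beam count are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def pointcloud_to_2d_scan(points, z_min=0.05, z_max=1.5,
                          n_beams=360, max_range=8.0):
    """Project an (N, 3) RGB-D point cloud (x, y, z in the sensor
    frame, meters) into a 2D range scan with n_beams angular bins."""
    # The height-band filter doubles as crude denoising: it drops
    # floor returns and ceiling clutter before projection.
    xy = points[(points[:, 2] > z_min) & (points[:, 2] < z_max), :2]

    ranges = np.hypot(xy[:, 0], xy[:, 1])
    angles = np.arctan2(xy[:, 1], xy[:, 0])                  # [-pi, pi)
    beams = ((angles + np.pi) / (2 * np.pi) * n_beams).astype(int) % n_beams

    # Keep the nearest return per beam, as a physical 2D LiDAR would.
    scan = np.full(n_beams, max_range)
    np.minimum.at(scan, beams, np.clip(ranges, 0.0, max_range))
    return scan
```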
Late fusion is applied to combine the processed data, making it ready for use in the SLAM system. Additionally, we propose the enhanced Gmapping (EGM) algorithm, which adds adaptive resampling and degeneracy handling to address particle depletion, thereby improving the robustness of the localization process. The system is evaluated through simulations and a small-scale real-world implementation using a Tiago robot.
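The sketch below illustrates two of these ideas under stated assumptions. For late fusion, a per-beam minimum over two aligned 2D scans is shown as one simple combination rule; the paper's actual fusion strategy may differ. For adaptive resampling, the particle set is resampled only when the effective sample size N_eff = 1 / Σ w_i² drops below a fraction of N, the standard remedy for depletion in Gmapping-style filters; the threshold and function names are hypothetical, and EGM's degeneracy handling is not reproduced here.

```python
import numpy as np

def fuse_scans(lidar_scan, projected_scan):
    # Per-beam minimum is one simple late-fusion rule for two aligned
    # 2D scans (same beam count and frame); an illustrative choice,
    # not necessarily the paper's.
    return np.minimum(lidar_scan, projected_scan)

def maybe_resample(particles, weights, ratio=0.5, rng=None):
    """Systematic resampling of a particle array, triggered only when
    N_eff = 1 / sum(w_i^2) falls below ratio * N (ratio is a common
    but assumed threshold, not the paper's exact value)."""
    rng = rng or np.random.default_rng()
    n = len(weights)
    w = weights / weights.sum()

    n_eff = 1.0 / np.sum(w ** 2)
    if n_eff >= ratio * n:
        return particles, w        # weights still diverse: skip resampling

    # Systematic (low-variance) resampling preserves more distinct
    # particles than multinomial draws, mitigating depletion.
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(w), positions)
    return particles[idx], np.full(n, 1.0 / n)
```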
In simulations, the system was tested in environments of varying complexity, with our EGM compared against state-of-the-art methods such as RTAB-Map SLAM. Results show general improvements in navigation over these approaches: in simulation, an 8% reduction in traveled distance, a 13% reduction in processing time, and a 15% improvement in goal completion. In small-scale real-world tests, the EGM showed slight improvements over the classical GM method: a 3% reduction in traveled distance and a 9% decrease in execution time.