Is RTAB-Map SLAM possible with my robot's configuration?
Hello,
I am designing a basic four-wheeled differential-drive robot capable of autonomous navigation on a static 2D map built with the RTAB-Map SLAM algorithm. For environment mapping and odometry I plan to use an Intel RealSense D435 RGB-D camera together with a 360° lidar. I do not have access to wheel encoders that could also contribute to odometry tracking for this project, but from my understanding it may still be possible to accomplish my goal without them. If that is true, could anyone point me toward documentation that would be useful for this task?

The robot uses a Raspberry Pi 4 to interface with both the lidar and the RGB-D camera. The Raspberry Pi 4 and the separate remote computer interfacing with the robot both run ROS Noetic on Ubuntu 20.04. Any help would be greatly appreciated.
Asked by Devin1126 on 2022-12-26 02:39:22 UTC
Answers
There are multiple options depending on the environment. If wheel odometry cannot be computed accurately, you may rely on lidar odometry alone (see example here). If your wheel odometry is not too bad but drifts quite fast, you may add an extra IMU and do sensor fusion to get better wheel odometry, then use that configuration as the guess for lidar odometry (this would also help with lidar deskewing).
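As a rough sketch of the lidar-only option (since you have no encoders), a ROS Noetic launch file could start rtabmap_ros's icp_odometry node on the lidar scan topic. The /scan topic name, frame ids, and ICP values below are assumptions to adapt to your robot, not a tuned setup:

    <launch>
      <!-- Sketch: lidar-only odometry with rtabmap_ros's icp_odometry node.
           The /scan topic and frame ids are assumptions; match them to your robot. -->
      <node pkg="rtabmap_ros" type="icp_odometry" name="icp_odometry" output="screen">
        <remap from="scan" to="/scan"/>
        <param name="frame_id"      value="base_link"/>
        <param name="odom_frame_id" value="odom"/>
        <!-- Point-to-plane ICP is usually more stable than point-to-point -->
        <param name="Icp/PointToPlane" value="true"/>
        <!-- Downsample the scan before registration -->
        <param name="Icp/VoxelSize"    value="0.05"/>
      </node>
    </launch>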
Whichever approach above is used, you can feed the D435 images to rtabmap for global loop closure detection (global localization). I recommend feeding stereo IR images (or IR + depth) with the IR emitter off, rather than the RGB-D streams, to avoid blurry images. See the examples on the same page as above.
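For the stereo-IR setup, a minimal sketch might look like the following. It assumes the default realsense2_camera infra topic names and a Noetic-era rtabmap_ros; parameter values are illustrative:

    <launch>
      <!-- Sketch: rtabmap fusing lidar scans with D435 stereo IR for loop closures.
           Topic names assume the default realsense2_camera infra streams. -->
      <node pkg="rtabmap_ros" type="rtabmap" name="rtabmap" output="screen"
            args="--delete_db_on_start">
        <param name="frame_id"         value="base_link"/>
        <param name="subscribe_scan"   value="true"/>
        <param name="subscribe_stereo" value="true"/>
        <param name="subscribe_depth"  value="false"/>
        <param name="approx_sync"      value="true"/>

        <remap from="scan"              to="/scan"/>
        <remap from="odom"              to="/odom"/>
        <remap from="left/image_rect"   to="/camera/infra1/image_rect_raw"/>
        <remap from="right/image_rect"  to="/camera/infra2/image_rect_raw"/>
        <remap from="left/camera_info"  to="/camera/infra1/camera_info"/>
        <remap from="right/camera_info" to="/camera/infra2/camera_info"/>

        <!-- Refine links with ICP on the scans; build the 2D grid from the lidar -->
        <param name="RGBD/NeighborLinkRefining" value="true"/>
        <param name="Reg/Strategy"              value="1"/>
        <param name="Grid/FromDepth"            value="false"/>
      </node>

      <!-- Turn the IR emitter off so the IR images are usable for visual features -->
      <node pkg="dynamic_reconfigure" type="dynparam" name="disable_emitter"
            args="set /camera/stereo_module emitter_enabled 0"/>
    </launch>

With Reg/Strategy=1, loop-closure links detected from the IR images are refined by ICP on the scans, which keeps the 2D map consistent even where visual features are sparse.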
Answered by matlabbe on 2022-12-30 14:06:27 UTC
Comments
Thank you! I will check out your proposed solution as well.
Commented by Devin1126 on 2022-12-30 14:27:50 UTC
Comments
Unfortunately this appears to be a duplicate of #q410954.
Commented by gvdhoorn on 2023-01-02 06:41:06 UTC