
rtabmap_ros generates a distorted map when using the filtered odometry from the robot_localization ekf node

asked 2022-02-16 10:01:25 -0500

SahanGura

Platform - Raspberry Pi 4 with Ubuntu 18.04 + ROS Melodic

I am trying to generate a 3D map with rtabmap using visual odometry and IMU data. To fuse the IMU data with the visual odometry, I use the ekf node of the robot_localization package. This is my pipeline (a small topic sanity-check sketch follows the list):

  • RGBD data ---> rgbd_odometry node of rtabmap_ros ---> visual odometry (/vo)
  • IMU raw data ---> Madgwick filter ---> orientation estimate in the ENU frame (/imu/data)
  • (/vo) + (/imu/data) ---> ekf node ---> filtered odometry (/odometry/filtered)
  • (/odometry/filtered) ---> rtabmap ---> map data
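For reference, here is a minimal rospy sketch (a debugging aid, not part of the mapping pipeline itself) that subscribes to the three topics above and prints their approximate rates and frame_ids. It assumes the topic names /vo, /imu/data and /odometry/filtered from this list; the actual fusion settings still live in the robot_localization YAML and are not shown here.

    #!/usr/bin/env python
    # Sanity check for the pipeline wiring: report rate and frame_id of the
    # visual odometry, filtered IMU and fused odometry topics.
    # Topic names are taken from the question; adjust them to your launch files.
    import rospy
    from nav_msgs.msg import Odometry
    from sensor_msgs.msg import Imu

    counts = {'/vo': 0, '/imu/data': 0, '/odometry/filtered': 0}
    frames = {}

    def make_cb(topic):
        def cb(msg):
            counts[topic] += 1
            frames[topic] = msg.header.frame_id
        return cb

    if __name__ == '__main__':
        rospy.init_node('pipeline_check')
        rospy.Subscriber('/vo', Odometry, make_cb('/vo'))
        rospy.Subscriber('/imu/data', Imu, make_cb('/imu/data'))
        rospy.Subscriber('/odometry/filtered', Odometry, make_cb('/odometry/filtered'))
        start = rospy.Time.now()
        rate = rospy.Rate(0.5)
        while not rospy.is_shutdown():
            rate.sleep()
            elapsed = (rospy.Time.now() - start).to_sec()
            if elapsed > 0:
                for topic in counts:
                    rospy.loginfo('%s: %.1f Hz, frame_id=%s',
                                  topic, counts[topic] / elapsed,
                                  frames.get(topic, '?'))

Checking that these frame_ids are consistent with the ekf configuration is a quick way to rule out wiring problems before blaming the filter itself.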

But the resulting map from rtabmap is distorted, with repeated segments in different positions. This appears to be caused by an issue with the filtered odometry from the ekf node: when I feed the visual odometry computed by the rgbd_odometry node (/vo) directly to the rtabmap node, the map is generated without any distortion.

The whole point of fusing the IMU data with the visual odometry is to generate the map with better accuracy, but due to this issue with the filtered odometry the map gets distorted. What is the best approach to make this work?


2 Answers


answered 2022-03-01 21:54:11 -0500

SahanGura

An update regarding the above issue:

When the rgbd_odometry node loses the odometry, the EKF cannot detect it and keeps predicting the odometry from its state transition model without measurement data (in the case where no IMU is used). As a result, the generated map starts to distort from the moment the first odometry loss happens. There should be a way to communicate the moment of odometry loss to the EKF, so that it stops publishing inaccurate predicted odometry. That is what happens when we feed the visual odometry generated by the rgbd_odometry node directly to the rtabmap node: rtabmap detects when odometry is lost and stops mapping until valid odometry is available again.
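One workaround along these lines is a small gating node between the EKF and rtabmap: it forwards /odometry/filtered only while the latest /vo message still looks valid, and drops it otherwise so that rtabmap pauses mapping instead of integrating extrapolated poses. This is only a sketch: the "lost" test below (an all-zero quaternion or a very large pose covariance on /vo) reflects how rgbd_odometry is commonly reported to signal tracking loss, so verify the exact convention of your rtabmap version, and /odometry/gated is a hypothetical output topic that rtabmap's odom input would have to be remapped to.

    #!/usr/bin/env python
    # Gate the EKF output with the validity of the visual odometry, so that
    # rtabmap stops receiving odometry while rgbd_odometry has lost tracking.
    import rospy
    from nav_msgs.msg import Odometry

    vo_ok = False

    def vo_cb(msg):
        # Heuristic "odometry lost" test (assumption: null quaternion or a
        # huge pose covariance on /vo means tracking was lost).
        global vo_ok
        q = msg.pose.pose.orientation
        null_quat = (q.x == 0.0 and q.y == 0.0 and q.z == 0.0 and q.w == 0.0)
        huge_cov = msg.pose.covariance[0] > 1000.0
        vo_ok = not (null_quat or huge_cov)

    def filtered_cb(msg):
        # Only forward the EKF estimate while the visual odometry is valid.
        if vo_ok:
            pub.publish(msg)

    if __name__ == '__main__':
        rospy.init_node('odom_gate')
        pub = rospy.Publisher('/odometry/gated', Odometry, queue_size=10)
        rospy.Subscriber('/vo', Odometry, vo_cb)
        rospy.Subscriber('/odometry/filtered', Odometry, filtered_cb)
        rospy.spin()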


Comments

See my comment on your post here: https://answers.ros.org/question/2215...

matlabbe ( 2022-03-06 22:34:12 -0500 )

answered 2022-02-20 16:01:32 -0500

matlabbe

I would suggest finding a third-party VIO (visual-inertial odometry) approach and feeding its output to rtabmap; you will get a better fusion of vision and IMU. Another option is to feed the IMU data to rgbd_odometry with wait_imu_to_init=true, which gives a loosely coupled VIO approach.
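If you try the wait_imu_to_init route (assuming your installed rtabmap_ros version exposes that parameter), the odometry node generally expects the IMU messages to carry a usable orientation, which is why a Madgwick filter sits in the pipeline. A small check like the following can confirm that /imu/data actually contains a roughly unit-norm quaternion before the odometry node waits on it; the topic name and tolerance are assumptions to adapt to your setup.

    #!/usr/bin/env python
    # Check that the filtered IMU topic carries a valid orientation estimate,
    # which wait_imu_to_init-style initialization relies on.
    import math
    import rospy
    from sensor_msgs.msg import Imu

    def cb(msg):
        q = msg.orientation
        norm = math.sqrt(q.x ** 2 + q.y ** 2 + q.z ** 2 + q.w ** 2)
        if abs(norm - 1.0) > 0.01:
            rospy.logwarn('IMU orientation looks invalid (norm=%.3f); '
                          'odometry initialization may never complete.', norm)
        else:
            rospy.loginfo_throttle(5.0, 'IMU orientation OK (frame=%s)',
                                   msg.header.frame_id)

    if __name__ == '__main__':
        rospy.init_node('imu_orientation_check')
        rospy.Subscriber('/imu/data', Imu, cb)
        rospy.spin()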


Comments

Thank you for your suggestion. At the moment I am unable to provide VIO directly to rtabmap; my goal is to generate visual-inertial odometry using the EKF of the robot_localization package.

I have also checked the parameters of the rtabmap_ros odometry nodes, but I could not find a wait_imu_to_init parameter for handling IMU data while visual odometry is being computed. From my understanding of the rtabmap_ros package, there is no way to use IMU data together with RGBD data inside the package. That is why I am trying to get accurate odometry values using the EKF of the robot_localization package.

SahanGura ( 2022-03-01 21:51:31 -0500 )
