
Integrating an IMU with rtabmap

asked 2016-11-14 18:32:47 -0500

natejgardner

My rtabmap has a very hard time keeping track of orientation during both localization and mapping. I'm wondering if there is an easy way to feed in IMU data so that the direction of gravity can help orient the images and produce better results. That is, can I associate orientation data with every frame in a way that improves rtabmap's SLAM capabilities?


1 Answer


answered 2016-11-15 13:24:53 -0500

matlabbe

Related topics:

As mentioned in one of these posts, the robot_localization package can be used to perform loosely-coupled sensor fusion of IMU and visual odometry. The launch file sensor_fusion.launch is a simple example combining a Kinect + IMU with rtabmap.
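For reference, the core of such a setup is an ekf_localization_node that fuses position from rtabmap's visual odometry with orientation from the IMU, and feeds the filtered odometry back to rtabmap. Below is a minimal hedged sketch, not the actual sensor_fusion.launch; the topic names (/rtabmap/odom, /imu/data) and parameter values are assumptions and will depend on your setup. The odom0_config / imu0_config vectors follow robot_localization's layout: [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az].

```xml
<launch>
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization">
    <param name="frequency" value="50"/>
    <param name="two_d_mode" value="false"/>

    <!-- Visual odometry from rtabmap: fuse position only (assumed topic) -->
    <param name="odom0" value="/rtabmap/odom"/>
    <rosparam param="odom0_config">[true,  true,  true,
                                    false, false, false,
                                    false, false, false,
                                    false, false, false,
                                    false, false, false]</rosparam>

    <!-- IMU: fuse orientation only; linear accelerations left out,
         mirroring the imu_ignore_acc idea discussed below (assumed topic) -->
    <param name="imu0" value="/imu/data"/>
    <rosparam param="imu0_config">[false, false, false,
                                   true,  true,  true,
                                   false, false, false,
                                   false, false, false,
                                   false, false, false]</rosparam>
  </node>

  <!-- rtabmap would then subscribe to the fused output, e.g.: -->
  <!-- <remap from="odom" to="/odometry/filtered"/> -->
</launch>
```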

cheers


Comments

Hi! I was just checking this sensor_fusion.launch — is it correct to assume that it is just combining position information from odom0 and orientation information from imu0 (if the imu_ignore_acc argument is set)? Wouldn't it be better to set the orientation parameters of odom0_config to true? Rookie question.

Danilo_BR ( 2017-11-14 12:20:47 -0500 )

1. Yes. 2. It depends: the covariances should be carefully tuned if you want to fuse the orientations of both sensors. The example effectively assumes that the IMU's orientation covariance is much smaller than the visual odometry's.

matlabbe ( 2017-11-14 14:22:32 -0500 )
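To illustrate the alternative the comment asks about: enabling the orientation rows of odom0_config makes the EKF fuse orientation from both sources, weighted by the covariances each one publishes. A hedged sketch (the matrix layout follows robot_localization; whether this helps depends entirely on realistic covariances from both sensors):

```xml
<!-- Fuse orientation from BOTH visual odometry and the IMU.
     Only sensible if /rtabmap/odom publishes a realistic (i.e. much
     larger) orientation covariance than the IMU; otherwise the filter
     will trust the noisier source too much. -->
<rosparam param="odom0_config">[true,  true,  true,
                                true,  true,  true,
                                false, false, false,
                                false, false, false,
                                false, false, false]</rosparam>
```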

