
Sensor fusion of IMU and ASUS for RTAB-MAP with robot_localization

asked 2016-08-25 07:00:43 -0600 by MahsaP

updated 2016-09-08 05:21:43 -0600


@matlabbe, I have an ASUS Xtion Pro Live and a VN-100T IMU, mounted as in the following picture:

I tried to fuse the data from these two sensors with robot_localization, similar to your launch file here. The difference in my launch file is that I added the tf relationships as:

<node pkg="tf" type="static_transform_publisher" name="base_link_to_camera_link_rgb"   args="0.0 0 0.0 0.0 0 0.0 base_link camera_link 20" />

<node pkg="tf" type="static_transform_publisher" name="base_link_to_imu_link" args="0.0 0.0 0.5 0 0 0 base_link imu 20" />

After I launch it, I get the following result for the map cloud. In this picture I am not moving the sensors at all, yet the view starts to rotate, and the information is not correct.


Also, when I move the sensors, the coordinate frames do not move correctly. I read "Preparing your sensor data", but I am not sure whether something is wrong with the placement of the IMU.

Do you think there is a problem with my usage of the tf package? I think the relationship is correct, but I don't know how to remove this drift (or motion) when I don't move the sensors at all.

The rqt_graph and TF tree are like the following:

I also read these two questions (1) and (2), but I didn't understand how to solve it. Thank you.

Edit: When I do rostopic echo /imu/imu, I get the following output:

  header:
    seq: 1912
    stamp:
      secs: 1472737002
      nsecs: 470006463
    frame_id: imu
  orientation:
    x: -0.989729464054
    y: -0.141599491239
    z: 0.0163822248578
    w: 0.0108099048957
  orientation_covariance: [0.0005, 0.0, 0.0, 0.0, 0.0005, 0.0, 0.0, 0.0, 0.0005]
  angular_velocity:
    x: 0.0242815092206
    y: -0.000830390374176
    z: -0.0146896829829
  angular_velocity_covariance: [0.00025, 0.0, 0.0, 0.0, 0.00025, 0.0, 0.0, 0.0, 0.00025]
  linear_acceleration:
    x: 0.298014938831
    y: 0.262553811073
    z: 9.89032745361
  linear_acceleration_covariance: [0.1, 0.0, 0.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.1]


@Tom Moore, would you please give me some hints on my question? Thank you.

MahsaP ( 2016-09-08 05:19:59 -0600 )

Can you please post your launch file and sample messages for all inputs? Which IMU driver are you using? Is it reporting data in ENU frame? It looks like it must be, given your linear acceleration, but I want to be sure.

Tom Moore ( 2016-09-13 03:18:08 -0600 )

@Tom Moore For the IMU I am using this driver on Ubuntu 14.04 with ROS Indigo. Similar to the picture in the question, I put the IMU on top of the vision sensor. I tried to use the ENU frame, but the positive directions for the IMU are NED.

MahsaP ( 2016-09-13 03:49:10 -0600 )

If I use NED, does that cause a problem?

MahsaP ( 2016-09-13 03:49:33 -0600 )

I put the launch files here. I will add some plots to the question for the outputs.

MahsaP ( 2016-09-13 04:10:38 -0600 )


I am using the same IMU and driver as you, and with unedited driver code my linear accelerations are about the negatives of yours. Did you have to modify the driver code so that the z acceleration = +g in the orientation of your picture? This is not a correction; I am just trying to understand.

matthewlefort ( 2016-09-13 17:14:15 -0600 )

@matthewlefort I just modified the code to add the covariance matrix.

MahsaP ( 2016-09-14 08:37:44 -0600 )

If it's NED frame, then yes, that is definitely a problem. r_l assumes ROS standards (see REP-103), and an IMU that measures motion around the Z axis in the wrong direction is not going to work.

Tom Moore ( 2016-09-20 02:29:05 -0600 )

2 Answers


answered 2016-10-09 04:47:50 -0600 by Tom Moore

Looking at the driver source, I think it's just using the data straight off the IMU. If it is, then the data is in NED frame, which does not follow the ROS standards, and is therefore not compatible with robot_localization. To fix it, you'll need to negate the Y and Z axes for everything the IMU produces, including the quaternion.
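The axis flip described here can be sketched in plain Python. This is only an illustration (the helper names and plain-tuple message representation are assumptions, not part of the vectornav driver); in a real node the same negations would be applied to the fields of the incoming sensor_msgs/Imu before republishing:

```python
# NED -> ROS-style ENU body axes: negate the Y and Z components of every
# vector quantity, and the y/z components of the orientation quaternion
# (this quaternion change is equivalent to conjugating by a 180-degree
# rotation about the X axis).

def ned_to_enu_vector(v):
    """v is an (x, y, z) tuple, e.g. angular velocity or linear acceleration."""
    x, y, z = v
    return (x, -y, -z)

def ned_to_enu_quaternion(q):
    """q is an (x, y, z, w) tuple; the scalar part w is unchanged."""
    x, y, z, w = q
    return (x, -y, -z, w)

# Example: an NED accelerometer reading with gravity on -Z becomes +Z.
print(ned_to_enu_vector((0.3, 0.2, -9.81)))  # (0.3, -0.2, 9.81)
```

Applied to the orientation, angular velocity, and linear acceleration of each message, this makes the data consistent with REP-103 before it reaches robot_localization.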


answered 2016-08-26 13:34:36 -0600 by matlabbe

updated 2016-09-01 09:58:48 -0600


There may be a TF problem. You can also look at sensor_fusion_kinect_brick.launch, which defines the TF used in this example (note that freenect is started without publishing TF):

  <!-- IMU frame: just over the RGB camera -->
  <node pkg="tf" type="static_transform_publisher" name="rgb_to_imu_tf"
      args="-0.032 0.0 0.032 0.0 0.0 0.0 /camera_rgb_frame /imu_link 100" />

  <arg name="pi/2" value="1.5707963267948966" />
  <arg name="optical_rotate" value="0 0 0 -$(arg pi/2) 0 -$(arg pi/2)" />
  <node pkg="tf" type="static_transform_publisher" name="optical_rotation" 
    args="$(arg optical_rotate) /camera_rgb_frame /camera_rgb_optical_frame 100" />

The setup was similar to this (IMU just over the RGB camera, with x forward, z up and y left):

Be aware that in <node pkg="tf" type="static_transform_publisher" name="base_link_to_imu_link" args="0.0 0.0 0.5 0 0 0 base_link imu 20" /> the translation is 50 cm in Z!


The covariance for imu data should be set. Refer to the robot_localization guide. Here is my imu data:

  header:
    seq: 1426
    stamp:
      secs: 1472741597
      nsecs: 130712805
    frame_id: imu_link
  orientation:
    x: 0.0176402368309
    y: 0.0118415430629
    z: 0.010254532137
    w: 0.999755844473
  orientation_covariance: [0.0005, 0.0, 0.0, 0.0, 0.0005, 0.0, 0.0, 0.0, 0.0005]
  angular_velocity:
    x: 0.0010908307825
    y: -0.00218166156499
    z: 0.00218166156499
  angular_velocity_covariance: [0.00025, 0.0, 0.0, 0.0, 0.00025, 0.0, 0.0, 0.0, 0.00025]
  linear_acceleration:
    x: 0.0
    y: 0.09
    z: 0.18
  linear_acceleration_covariance: [0.1, 0.0, 0.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.1]
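
Filling in the covariance can be done where the driver publishes the message. A minimal sketch (the diagonal values are just the ones from the sample message above, and the message construction is reduced to plain Python lists for illustration, in place of a real sensor_msgs/Imu):

```python
# sensor_msgs/Imu stores each covariance as a row-major 3x3 matrix
# flattened into a list of 9 floats.
def diagonal_covariance(variance):
    """Build a row-major 3x3 covariance with `variance` on the diagonal."""
    cov = [0.0] * 9
    cov[0] = cov[4] = cov[8] = variance
    return cov

# In a real driver these would be assigned to msg.orientation_covariance,
# msg.angular_velocity_covariance, and msg.linear_acceleration_covariance.
orientation_covariance = diagonal_covariance(0.0005)
angular_velocity_covariance = diagonal_covariance(0.00025)
linear_acceleration_covariance = diagonal_covariance(0.1)

print(orientation_covariance)
# [0.0005, 0.0, 0.0, 0.0, 0.0005, 0.0, 0.0, 0.0, 0.0005]
```

Leaving the covariance at all zeros tells robot_localization nothing about how much to trust each measurement, so reasonable nonzero diagonal values matter.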

Is it drifting in orientation or position? In sensor_fusion.launch, the IMU is used only for orientation data and visual odometry only for position data.




@matlabbe Thank you for your answer. When I do the same as you described, I get the following warning:

[ WARN] [1472317843.569504300]: odometry: Could not get transform from odom to camera_rgb_frame (stamp=1472317843.414791) after 0.100000 seconds ("wait_for_transform_duration"=0.100000)!

MahsaP ( 2016-08-27 12:12:22 -0600 )

It looks like robot_localization is not publishing TF, or the frame /camera_rgb_frame doesn't exist. Based on your TF tree above, you should use the /base_link frame as the frame_id for the visual odometry and rtabmap nodes.
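For example, the frame_id parameter can be set on the odometry and rtabmap nodes in the launch file (a sketch only; the node names here are illustrative, so adapt them to your own launch file):

```xml
<node pkg="rtabmap_ros" type="rgbd_odometry" name="rgbd_odometry">
  <param name="frame_id" value="base_link"/>
</node>

<node pkg="rtabmap_ros" type="rtabmap" name="rtabmap">
  <param name="frame_id" value="base_link"/>
</node>
```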

matlabbe ( 2016-08-27 18:28:49 -0600 )

@matlabbe I changed the axes of the IMU by rotating it on the camera, similar to your picture. I also added base_link as written in my question, but I still don't get the correct result; there is a lot of drift.

MahsaP ( 2016-09-01 07:58:48 -0600 )

Can you update your question with the robot_localization configuration? In particular, are you using the IMU acceleration? Can you also show an example IMU message ($ rostopic echo /imu/imu) so we can see the covariance used?

matlabbe ( 2016-09-01 08:19:30 -0600 )

I am using the config as here. I will put the output of the imu in the question.

MahsaP ( 2016-09-01 08:40:18 -0600 )

Thanks, I will try to set the covariance in the IMU. The drift is in both orientation and position. Maybe the problem has to do with the covariance.

MahsaP ( 2016-09-01 10:50:47 -0600 )

I set the covariance to some values, but it is still the same.

MahsaP ( 2016-09-01 16:22:31 -0600 )


Seen: 1,517 times

Last updated: Oct 09 '16