
How to fuse IMU & visual odometry using robot_localization

asked 2015-11-26 by VMHOANG68, updated 2016-10-24 by ngrennan

Hello,

I am working on a project that combines visual odometry (from an ASUS Xtion Pro Live camera) with an IMU (SparkFun Razor). I want to fuse several sources of information to get the best possible odometry, and I am starting with visual odometry and the IMU. I use the rgbd_odometry node from the rtabmap_ros package for visual odometry and the razor_imu_9dof package for the IMU data. To combine the two sources, I use the ekf node from the robot_localization package. I publish two static transforms (/base_link -> /camera_link and /base_link -> /base_imu_link). I intend to fuse x, y, z, roll, pitch, and yaw from the visual odometry, and roll, pitch, and yaw from the IMU.

My filtered odometry is worse than the raw visual odometry: the pose drifts very quickly even when I don't move the camera, and the covariance matrix of the filtered odometry keeps growing. The covariance of the IMU data is set to a constant. The covariance of the visual odometry changes frequently, but its values are small (the visual odometry is a nav_msgs/Odometry message; the data is in its geometry_msgs/PoseWithCovariance field, while the geometry_msgs/TwistWithCovariance field is always zero). I don't understand whether this affects the system or not. Can you give me advice on configuring the filter, or suggest another way to fuse the two sources?
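A quick way to verify the covariance behaviour described above is to print the diagonals of both covariance fields of the incoming messages. A minimal rospy sketch, assuming the rtabmap/odom topic name from the launch file below:

#!/usr/bin/env python
# Print the pose and twist covariance diagonals of the visual odometry,
# to confirm whether the twist covariance really is all zeros.
import rospy
from nav_msgs.msg import Odometry

def callback(msg):
    # Each covariance is a 36-element row-major 6x6 matrix; diagonal = i * 7.
    pose_diag = [msg.pose.covariance[i * 7] for i in range(6)]
    twist_diag = [msg.twist.covariance[i * 7] for i in range(6)]
    rospy.loginfo("pose cov diag: %s | twist cov diag: %s", pose_diag, twist_diag)

if __name__ == "__main__":
    rospy.init_node("odom_cov_check")
    rospy.Subscriber("rtabmap/odom", Odometry, callback)
    rospy.spin()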

Another question is about the orientation of the camera and the IMU board. I set up the platform as shown below. Are the axes correct?

[attached image: platform setup showing camera and IMU axes]

I have attached the node graph, the TF tree, the robot_localization node configuration, and the static tf publisher configuration.

Thank you in advance for your attention to this matter.

[attached images: node graph and TF tree]

<node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization" clear_params="true">

  <!-- ======== STANDARD PARAMETERS ======== -->

  <param name="frequency" value="10"/>

  <param name="sensor_timeout" value="0.5"/>

  <param name="two_d_mode" value="false"/>

  <param name="map_frame" value="map"/>
  <!-- Defaults to "odom" if unspecified -->
  <param name="odom_frame" value="odom"/>
  <!-- Defaults to "base_link" if unspecified -->
  <param name="base_link_frame" value="base_link"/>
  <!-- Defaults to the value of "odom_frame" if unspecified -->
  <param name="world_frame" value="odom"/>

  <param name="transform_time_offset" value="0.0"/>

  <param name="odom0" value="rtabmap/odom"/>
  <param name="imu0" value="/imu"/>

  <rosparam param="odom0_config">[true, true, true,
                                  true, true, true,
                                  false,  false, false,
                                  false, false, false,
                                  false, false, false]</rosparam>

 <rosparam param="imu0_config">[false, false, false,
                                 true,  true,  true,
                                 false, false, false,
                                 false,  false,  false,
                                 false,  false,  false]</rosparam>

  <param name="odom0_differential" value="false"/>
  <param name="imu0_differential" value="false"/>

  <param name="odom0_relative" value="false"/>
  <param name="imu0_relative" value="false"/>

  <param name="imu0_remove_gravitational_acceleration" value="true"/>

  <param name="print_diagnostics" value="true"/>

  <!-- ======== ADVANCED PARAMETERS ======== -->

  <param name="odom0_queue_size" value="2"/>
  <param name="imu0_queue_size" value="10"/>

  <param name="odom0_pose_rejection_threshold" value="0.1"/>
  <param name="odom0_twist_rejection_threshold" value="0"/>
  <param name="imu0_pose_rejection_threshold" value="0.5"/>
  <param name="imu0_twist_rejection_threshold" value="0"/>
  <param name="imu0_linear_acceleration_rejection_threshold" value="0.1"/>

  <param name="debug"           value="false"/>

  <param name="debug_out_file"  value="debug_ekf_localization.txt"/>

  <rosparam param="process_noise_covariance">[0.001, 0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                                              0,    0.001, 0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                                              0,    0,    0.001, 0,    0,    0,    0,     0,     0,    0,    0 ...
[remainder of the 15x15 process_noise_covariance matrix and the closing tags truncated in the original post]
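The static tf publisher configuration was attached only as images above. For reference, a rough Python sketch of publishing the two transforms the question describes; the node name and translation offsets are placeholders, not the real mounting geometry:

#!/usr/bin/env python
# Sketch: publish the two static transforms described in the question.
# The translation offsets below are placeholders, not measured geometry.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

def make_tf(parent, child, x, y, z):
    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = parent
    t.child_frame_id = child
    t.transform.translation.x = x
    t.transform.translation.y = y
    t.transform.translation.z = z
    t.transform.rotation.w = 1.0  # identity rotation placeholder
    return t

if __name__ == "__main__":
    rospy.init_node("static_tf_sketch")
    broadcaster = tf2_ros.StaticTransformBroadcaster()
    broadcaster.sendTransform([make_tf("base_link", "camera_link", 0.1, 0.0, 0.2),
                               make_tf("base_link", "base_imu_link", 0.0, 0.0, 0.1)])
    rospy.spin()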

Comments

Please provide sample input messages from your sensors.

Tom Moore (2016-03-01)

Hello, were you able to successfully fuse VO and IMU using robot_localization? Could you share your config or launch file? I want to know where I made mistakes. Thanks for your help! :)

zhchpan (2018-01-20)

Please don't use an answer to ask a question. This isn't a forum. Please create a new question.

jayess (2018-01-20)

2 Answers


answered 2016-02-21 by Mos, updated 2016-02-22 by matlabbe

My advice: because you feed absolute position information in through your visual odometry, you need to set the parameter <param name="odom0_differential" value="false"/>.

Have another read of the documentation on how to set the parameters for ekf_localization properly.


Comments

Just for clarity: I guess what Mos means is setting that parameter to true. You are trying to fuse two absolute orientation data sources that are not aligned, which will not work well.

demmeln (2016-03-02)
answered 2022-03-01 by SahanGura, updated 2022-03-01

I am currently dealing with the same issue. When the rgbd_odometry node loses the odometry, the EKF cannot detect it and keeps predicting odometry from the state transition model without measurement data (in the case where no IMU is used). Therefore the generated map starts to distort from the moment the first odometry loss happens. There should be a way to communicate the moment odometry is lost to the EKF, so that it can stop predicting inaccurate odometry values. That is what happens when we feed the visual odometry generated by the rgbd_odometry node to the rtabmap node directly: the rtabmap node identifies when odometry is lost and stops mapping until odometry is properly available.
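One possible watchdog, assuming (as matlabbe notes in the comment below) that a lost odometry arrives as a null pose on the odometry topic; the topic name is the one from the question's launch file:

#!/usr/bin/env python
# Watchdog sketch: warn when the visual odometry arrives as a null pose,
# which (per matlabbe's comment below) is how rtabmap signals lost odometry.
import rospy
from nav_msgs.msg import Odometry

def callback(msg):
    q = msg.pose.pose.orientation
    # An all-zero quaternion is not a valid rotation, so treat it as "lost".
    if q.x == 0.0 and q.y == 0.0 and q.z == 0.0 and q.w == 0.0:
        rospy.logwarn("visual odometry lost (null pose received)")

if __name__ == "__main__":
    rospy.init_node("vo_watchdog")
    rospy.Subscriber("rtabmap/odom", Odometry, callback)
    rospy.spin()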


Comments

Beware that when odometry is lost on the rtabmap side, it publishes a null transform to tell listening nodes that odometry cannot be computed. With robot_localization, you may want to avoid publishing the odometry when it is lost by setting publish_null_when_lost to false, and fuse only the twist from the odometry topic in robot_localization. Odom/ResetCountdown would need to be set to 1 to automatically reset the odometry when it is lost.

matlabbe (2022-03-06)

The EKF needs to continue to produce a state estimate; what would you expect it to do in this instance? Stop producing all output? All the transforms it's meant to produce will time out.

In the end, your issue is your sensor data, rather than the sensor fusion itself. If you lack wheel encoders, then my advice to you would be to feed your commanded velocity to the EKF as a twist input source. Give it a large covariance so that it's not heavily trusted when your rgbd_odometry is functional.
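A minimal sketch of this suggestion, assuming a geometry_msgs/Twist command topic named cmd_vel; the output topic name and the variance magnitude are placeholders:

#!/usr/bin/env python
# Republish the commanded velocity as a TwistWithCovarianceStamped with
# large variances, so the EKF treats it as a weakly trusted velocity source.
import rospy
from geometry_msgs.msg import Twist, TwistWithCovarianceStamped

pub = None

def callback(msg):
    out = TwistWithCovarianceStamped()
    out.header.stamp = rospy.Time.now()
    out.header.frame_id = "base_link"
    out.twist.twist = msg
    for i in range(6):
        out.twist.covariance[i * 7] = 10.0  # large variance -> low trust
    pub.publish(out)

if __name__ == "__main__":
    rospy.init_node("cmd_vel_to_twist")
    pub = rospy.Publisher("cmd_vel_twist", TwistWithCovarianceStamped, queue_size=10)
    rospy.Subscriber("cmd_vel", Twist, callback)
    rospy.spin()

The republished topic would then be configured as a twistN input to the EKF, with only the velocity fields enabled in its config vector.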

Tom Moore (2022-05-24)
