There's quite a bit going on here!

If you can update your question and add some sample messages from each sensor source, I'd appreciate it.

I'd be very careful about fusing multiple sources of absolute pose information. I don't know enough about the SLAM package in question to comment on how it behaves, but if its estimate ever diverges or it makes a poor loop closure (is it running live SLAM, or just providing a pose estimate in your map?), then the SLAM position estimate and the GPS position estimate are going to differ wildly, and the filter will just jump back and forth between those state estimates.

The same goes for your IMU data. You are fusing absolute orientation data from your IMU, but there's no guarantee that it will match your SLAM positions. For example, say you start your robot in a pose where the IMU reads π/4 radians, and then drive straight forward one meter. If your SLAM package isn't using the IMU's absolute orientation, it will probably report an XY position of (1, 0), but that disagrees with the IMU's heading, which says you should be at (0.707, 0.707) after driving forward one meter.
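
To make the arithmetic concrete (just a toy calculation, nothing package-specific):

```python
import math

yaw_imu = math.pi / 4   # absolute IMU heading at startup (45 degrees)
yaw_slam = 0.0          # SLAM starts the robot at its own origin, facing +x
d = 1.0                 # meters driven straight forward

# Where each source says the robot ends up after driving forward:
print(d * math.cos(yaw_slam), d * math.sin(yaw_slam))  # 1.0 0.0
print(d * math.cos(yaw_imu), d * math.sin(yaw_imu))    # 0.707... 0.707...
```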

In any case, if you are fusing multiple absolute pose sources, you need to make sure they are reported in the same coordinate frame, or that a transform is defined that brings them into one. Something like the sketch below can help with the latter.
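
This is only a sketch: the topic and frame names are placeholders, and robot_localization should also do this lookup itself if your messages' frame_ids and the TF tree are set up correctly. But here's roughly how you could re-express a SLAM pose in the map frame with tf2:

```python
#!/usr/bin/env python
import rospy
import tf2_ros
import tf2_geometry_msgs  # noqa: F401 -- registers PoseStamped with tf2
from geometry_msgs.msg import PoseStamped

rospy.init_node('slam_pose_to_map')

tf_buffer = tf2_ros.Buffer()
tf_listener = tf2_ros.TransformListener(tf_buffer)

# Placeholder topic; the EKF ultimately wants PoseWithCovarianceStamped,
# but PoseStamped keeps the frame-handling idea clear.
pub = rospy.Publisher('slam_pose_in_map', PoseStamped, queue_size=10)

def pose_cb(msg):
    try:
        # Re-express the SLAM pose in the frame the map-frame EKF fuses in.
        # This requires a map -> <SLAM frame> transform on the TF tree.
        pose_in_map = tf_buffer.transform(msg, 'map', rospy.Duration(0.1))
        pub.publish(pose_in_map)
    except tf2_ros.TransformException as e:
        rospy.logwarn_throttle(5.0, 'Cannot transform SLAM pose yet: %s', e)

rospy.Subscriber('slam_pose', PoseStamped, pose_cb)
rospy.spin()
```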

My suggestion to you is to start with just the "tier 1" (odom frame) EKF. Get that looking the way you want, then move on to the second tier (map frame) EKF. If I were you, I'd probably fuse my full 3D pose from the SLAM package in the odom-frame EKF, and maybe just fuse the angular velocities from the IMU. If you lack a velocity measurement source, I'd stop fusing the linear acceleration data from the IMU, too.
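
For what it's worth, here's a rough sketch of what that tier-1 (odom frame) config might look like for ekf_localization_node. The topic names are placeholders, and you'd obviously want to tune this for your setup:

```yaml
frequency: 30
world_frame: odom            # tier 1: state estimate in the odom frame
odom_frame: odom
base_link_frame: base_link

# Full 3D pose from the SLAM package (placeholder topic name).
# Config order: [x, y, z, roll, pitch, yaw,
#                vx, vy, vz, vroll, vpitch, vyaw,
#                ax, ay, az]
pose0: /slam_pose
pose0_config: [true,  true,  true,
               true,  true,  true,
               false, false, false,
               false, false, false,
               false, false, false]

# IMU: angular velocities only; absolute orientation and linear
# acceleration switched off, per the discussion above.
imu0: /imu/data
imu0_config: [false, false, false,
              false, false, false,
              false, false, false,
              true,  true,  true,
              false, false, false]
```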