
# Robot_localization result is unpredictable

Hello,

I very recently posted a question about the robot_localization package and received some very good insight. I managed to fuse my wheel odometry with my IMU readings, but the result is not what I was expecting: there is significant lag in updates to the filtered transform. My initial approach was to fuse all the data at once, but I quickly learned I needed to try this with individual measurements (e.g., just the odometry's velocity). Hopefully showing my setup and some example videos will help explain the problem I'm having.

In the following example, I'm attempting the simplest setup possible. I only want the EKF to use my odometry's forward velocity value. Here's the setup & config file I'm using for this:

On ros-kinetic:

```yaml
frequency: 30
sensor_timeout: 0.1
two_d_mode: true
transform_time_offset: 0.0
transform_timeout: 0.0
print_diagnostics: true
publish_tf: true
publish_acceleration: false
use_control: false

#map_frame: map                  # Defaults to "map" if unspecified
#odom_frame: odom_raw            # Defaults to "odom" if unspecified
#world_frame: odom               # Defaults to the value of odom_frame if unspecified

odom0: odom

# x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az
odom0_config: [false, false, false,
               false, false, false,
               true,  false, false,
               false, false, false,
               false, false, false]

odom0_queue_size: 10
odom0_nodelay: false
odom0_differential: false
odom0_relative: false
odom0_pose_rejection_threshold: 5
odom0_twist_rejection_threshold: 1

process_noise_covariance: [0.05, 0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0.05, 0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0.06, 0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0.03, 0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0.03, 0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0.06, 0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0.025, 0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0.025, 0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0.04, 0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0.01, 0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0.01, 0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0.02, 0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0.01, 0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0.01, 0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0.015]

initial_estimate_covariance: [1e-9, 0,    0,    0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,
                              0,    1e-9, 0,    0,    0,    0,    0,    0,    0 ...
```
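For what it's worth, a config like this is typically loaded into the EKF node through a launch file along these lines (a sketch; the package name `my_robot` and the file path are assumptions, while `ekf_localization_node` is robot_localization's standard executable on kinetic):

```xml
<launch>
  <node pkg="robot_localization" type="ekf_localization_node"
        name="ekf_se" clear_params="true">
    <!-- Load the YAML parameters shown above -->
    <rosparam command="load" file="$(find my_robot)/params/ekf.yaml" />
  </node>
</launch>
```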

So my advice is to fuse one sensor at a time, but that comes with two caveats:

1. Don't fuse one _variable_ at a time. If a given sensor provides more than one variable, fuse everything you want from that sensor. Look at your odometry config:

```yaml
odom0_config: [false, false, false,
               false, false, false,
               true,  false, false,
               false, false, false,
               false, false, false]
```


You are fusing _only_ X velocity. The filter is estimating a 15-dimensional state, so every relevant pose variable (in this case, since you have two_d_mode set to _true_, that means X, Y, and yaw) needs to be either measured directly as a pose or observable through a velocity. Please change your odometry config to this (i.e., also set Y velocity and yaw velocity to _true_):

```yaml
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,
               false, false, true,
               false, false, false]
```


As you have it, the filter has to make predictions of the future state based on variables that are not being measured. The explosion in covariance will mean that the output will be completely unstable.
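To see why, here's a toy scalar Kalman sketch (not robot_localization itself, just an illustration; the noise constants are borrowed from the config above): every prediction step inflates the variance of all state variables by the process noise, but only variables that receive measurements ever get shrunk back down by a correction.

```python
# Toy illustration: two state variables, vx (measured) and vy (never measured).
q = 0.025              # process noise per step (the odom vx entry above)
r = 0.01               # assumed measurement noise for vx
p_vx, p_vy = 1e-9, 1e-9  # initial variances

for step in range(100):
    # Predict: both variances grow by the process noise.
    p_vx += q
    p_vy += q
    # Correct: only vx receives a measurement, so only it shrinks.
    k = p_vx / (p_vx + r)
    p_vx = (1.0 - k) * p_vx

print(round(p_vx, 4))  # bounded: corrections keep it small
print(round(p_vy, 4))  # 2.5: grows as steps * q, never corrected
```

The unmeasured variance grows without bound, and since the pose estimate is propagated using those unmeasured variables, the output drifts with it.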

2. If you have a sensor that only produces a single variable (e.g., a barometer that only produces Z position), then you should send some "fake" values for all the other variables to "clamp" them to a fixed value (read: 0). That, or just fuse the 0 values that come in for those variables, e.g., if the barometer data comes in on a PoseWithCovarianceStamped message, then go ahead and fuse all the other pose variables, which will probably have a 0 value anyway.
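One way to sketch that clamping idea is to build the fake measurement's covariance so that unmeasured axes get a tight variance around 0 (the helper name and noise values here are mine, not from the package; the 36-element row-major layout matches ROS pose covariance conventions):

```python
# Sketch: build the 6x6 (row-major, 36-element) pose covariance for a
# "fake" measurement that clamps unmeasured variables to 0.
# Variable order (ROS convention): x, y, z, roll, pitch, yaw.

def clamped_covariance(measured, sigma=0.1, clamp_sigma=1e-3):
    """Measured axes get the real sensor noise; unmeasured axes get a
    tight variance so the filter holds them near the (fake) 0 value."""
    cov = [0.0] * 36
    for i in range(6):
        var = sigma ** 2 if i in measured else clamp_sigma ** 2
        cov[i * 6 + i] = var
    return cov

# Barometer example: only Z (index 2) is real; clamp the rest to 0.
cov = clamped_covariance(measured={2})
print(cov[2 * 6 + 2])  # real sensor variance for Z
print(cov[0])          # tight clamp variance for X
```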

My initial guess was that this package doesn't handle fusing a single data point very well, but that would surprise me. I have also noticed that the acceleration values in the state matrix are non-zero, which was pretty confusing, considering I never introduced acceleration values.

The magic of matrix inversion and correlated variables means that, even if you don't directly measure, e.g., acceleration, but you do measure velocity, the filter will "create" the accelerations.
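A toy two-state filter shows the effect (the constants here are illustrative, not robot_localization's): it only ever measures velocity, but the motion model builds up cross-covariance between velocity and acceleration, so the correction step pushes a nonzero acceleration estimate.

```python
# Toy 2-state Kalman filter over [velocity, acceleration], measuring only
# velocity. The cross-covariance P[1][0] lets velocity residuals update
# the acceleration estimate.
dt, q, r = 0.1, 1e-3, 1e-2
x = [0.0, 0.0]                   # state: [v, a]
P = [[1.0, 0.0], [0.0, 1.0]]     # covariance

for k in range(50):
    # Predict with a constant-acceleration model: v' = v + a*dt, a' = a.
    x = [x[0] + x[1] * dt, x[1]]
    P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
          P[0][1] + dt * P[1][1]],
         [P[1][0] + dt * P[1][1],
          P[1][1] + q]]
    # Velocity measurement ramping at a true 1.0 m/s^2.
    z = 1.0 * dt * (k + 1)
    s = P[0][0] + r
    K = [P[0][0] / s, P[1][0] / s]  # gain couples a to v via P[1][0]
    y = z - x[0]
    x = [x[0] + K[0] * y, x[1] + K[1] * y]
    P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
         [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]

print(round(x[1], 2))  # acceleration estimate trends toward the true 1.0
```

The filter never saw an acceleration measurement, yet its acceleration state converges toward the true value purely through the correlation in P.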

Just as a note, I tried inputting all my data into robot_pose_ekf, and I got a much more realistic result.

That's because, IIRC, robot_pose_ekf doesn't let you control what you're fusing from the sensor input. It just grabs all the 2D pose data from your input messages and fuses it. Increased flexibility, in this case, equates to increased complexity.
