7

REP-105 and robot_localization

asked 2016-02-17 08:11:01 -0600

vschmidt

Hello,

I have a series of questions about REP-105 and the robot_localization package.

What is confusing to me is the utility of the odom reference frame, and it is possible that I don't understand its definition. REP-105 describes the odom reference frame as a world-fixed reference frame, but it also says that the pose of a robot can drift in the odom reference frame without any bounds.

I understand the problem. A vehicle whose position is estimated from dead-reckoning alone will accrue position estimate errors that will grow unbounded unless checked by an absolute position measurement.

So is the idea that one can track the dead-reckoned position estimate and the error between that and the true position separately? Perhaps this is done in the transform between the map frame and the odom frame?

What is the utility of the odom reference frame vs the map reference frame? Why is having a continuous "no-jumps" reference frame better than having a globally accurate one? If one is using an EKF and there are jumps in the positioning due to GPS data, does that not mean that the measurement covariance (or possibly the process covariance) is not set properly? It seems to me one is relying on the GPS too much; maybe the IMU is of too poor quality to adequately filter the GPS. I guess what I'm getting at is that a properly modeled and tuned EKF should (barring resets) not have jumps, and the odom frame might be unnecessary.

I can't wait to hear your thoughts!


2 Answers

9

answered 2016-03-01 13:00:12 -0600

Tom Moore

Imagine that your pose estimate is derived from fused GPS, wheel odometry, and IMU data. You tell your robot, whose position is (0, 0) in your world frame, to move to (10, 10), and it starts to move there. Along the way, the GPS jumps to (11, 11), and the robot suddenly thinks it's on the other side of the goal, so it tries to turn around. Then it jumps back to (9, 9), and it turns around again. The point is that controlling your robot in a discontinuous frame is problematic. Note that this isn't limited to GPS data: nodes that do actual localization (e.g. amcl) may undergo very small jumps in pose, but those may be enough to upset the control of the robot. One can imagine a situation where the threshold on your waypoint satisfaction is N radians, but the localization system produces measurements with error > N, and under-reports its covariance. In that instance, you may never "reach" the goal.
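To make that concrete, here is a tiny sketch (the coordinates are made up) of how the heading a simple go-to-goal controller wants to drive flips every time the fused map-frame pose jumps across the goal:

    # Rough sketch (made-up coordinates): a jump in the fused map-frame pose
    # flips the heading a go-to-goal controller wants to drive, so the robot
    # keeps turning around.
    import math

    goal = (10.0, 10.0)

    def heading_to_goal(pose):
        return math.degrees(math.atan2(goal[1] - pose[1], goal[0] - pose[0]))

    # Fused pose jumping back and forth with the GPS:
    for pose in [(9.0, 9.0), (11.0, 11.0), (9.0, 9.0)]:
        print(pose, "-> desired heading %.0f deg" % heading_to_goal(pose))
    # (9, 9) -> 45 deg, (11, 11) -> -135 deg, (9, 9) -> 45 deg again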

In this case, it isn't just the EKF parameters that need to be tuned, but the covariance matrices in the measurements themselves. If the measurement covariance is overconfident, then the filter output will have discrete jumps that match the input. Unfortunately, this is a common occurrence with drivers and localization packages.
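For reference, the covariance the filter consumes is carried in the measurement messages themselves, e.g. the 6x6 pose covariance of a nav_msgs/Odometry. A minimal sketch (topic, frame names, and values are placeholders, not recommendations):

    # Sketch of where measurement covariance lives in a typical input message.
    # Topic, frame names, and values are placeholders for illustration only.
    import rospy
    from nav_msgs.msg import Odometry

    rospy.init_node('fake_gps_odom')
    pub = rospy.Publisher('gps/odom', Odometry, queue_size=1)

    msg = Odometry()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = 'map'
    msg.child_frame_id = 'base_link'
    msg.pose.pose.position.x = 10.0
    msg.pose.pose.position.y = 10.0
    # 6x6 row-major covariance over (x, y, z, roll, pitch, yaw). Hard-coding
    # tiny values here is exactly the kind of over-confidence that makes the
    # filter output follow every jump in the input.
    cov = [0.0] * 36
    cov[0] = cov[7] = 4.0                          # x, y variance (m^2)
    cov[14] = cov[21] = cov[28] = cov[35] = 1e6    # unused dimensions
    msg.pose.covariance = cov
    pub.publish(msg)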

Furthermore, REP-105 is a standard that applies to all ROS nodes, and not just those that perform fusion. You can simply run a robot controller that produces odometry in the odom frame and a localization node (e.g., amcl) that produces pose estimates in the map frame, and not bother with any sensor fusion packages like robot_localization. In that case, no fusion is occurring, so if you want to control your robot, you'll want to do so in the odom frame, as the map frame data is subject to discontinuities.
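In such a setup, a controller typically looks up the robot pose in the odom frame and leaves the map frame to the planner. A rough sketch (the node name is just an example; frame names follow REP-105):

    # Sketch: use the continuous odom-frame pose for local control and the
    # map-frame pose for global planning.
    import rospy
    import tf

    rospy.init_node('pose_lookup_example')
    listener = tf.TransformListener()
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        try:
            # Smooth but drift-prone: suitable for control.
            odom_pose = listener.lookupTransform('odom', 'base_link', rospy.Time(0))
            # Globally accurate but may jump when amcl corrects: use for planning.
            map_pose = listener.lookupTransform('map', 'base_link', rospy.Time(0))
            rospy.loginfo('odom: %s  map: %s', odom_pose[0], map_pose[0])
        except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
            pass
        rate.sleep()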

In any case, you are correct that with proper error management, you can mitigate these effects and work with a single frame. Some companies have localization solutions that are so good that they can control their robot in the map frame (i.e., fusing the localization data) and not ever run into trouble with discontinuities.


Comments

Thanks, this is great. I appreciate the time.

I will admit, however, that it seems odd to my naive eye to accommodate in the reference frames what looks to me like a misconfigured system.

vschmidt ( 2016-03-04 12:16:50 -0600 )

What seems misconfigured to you?

demmeln ( 2016-03-04 15:12:34 -0600 )

If the GPS position jumps like that, and one's filtered position is then discontinuous, would that not indicate that the GPS covariance is too small? Also, it seems to me that if the GPS error is large, then one's sense of having arrived needs to be appropriately sized.

vschmidt ( 2016-03-04 19:22:02 -0600 )

Also, in some commercial systems GPS error is not assumed to be white. And in some, large jumps in position are not accepted into the Kalman solution until corroborated by many fixes. I'm not saying this strategy is wrong, just that it is different from my own experience.

vschmidt ( 2016-03-04 19:30:08 -0600 )

"If the GPS position jumps like that, and one's filtered position is then discontinuous, would that not indicate that the GPS covariance is too small?" --> A covariance that is too small can be one cause, but it doesn't have to be. See my answer below for a situation with perfect covariances, yet still a jump in pose.

demmeln ( 2016-03-04 20:32:13 -0600 )

As for the more advanced GPS error / outlier handling: you are right, this is currently not implemented in robot_localization, but it could be implemented on top of it with a node that pre-filters the GPS data before feeding it to robot_localization. Still, to me this seems independent of the odom frame.

demmeln ( 2016-03-04 20:34:11 -0600 )

You can reject individual outliers using the ...rejection_threshold parameter. If GPS is your only absolute pose estimate source, then after you reject a few measurements (so you have no pose measurements), the estimate covariance will grow, which should cause previous outliers to become inliers.

Tom Moore ( 2016-03-05 06:47:16 -0600 )
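The pre-filtering idea mentioned in the comments above could look roughly like the following (thresholds, topic names, and the corroboration rule are all made up for illustration; robot_localization's own rejection-threshold parameters provide a simpler, built-in form of gating):

    # Sketch of a GPS pre-filter node: hold back fixes that jump far from the
    # last accepted fix until several consecutive fixes corroborate the jump.
    # Thresholds and topic names are illustrative only.
    import math
    import rospy
    from sensor_msgs.msg import NavSatFix

    JUMP_DEG = 0.0005        # roughly 50 m of latitude; purely illustrative
    CORROBORATION = 5        # consecutive outliers needed to accept a jump

    last_accepted = None
    pending = 0

    def cb(fix):
        global last_accepted, pending
        if last_accepted is not None:
            jump = math.hypot(fix.latitude - last_accepted.latitude,
                              fix.longitude - last_accepted.longitude)
            if jump > JUMP_DEG and pending < CORROBORATION:
                pending += 1     # outlier: wait for corroboration
                return
        pending = 0
        last_accepted = fix
        pub.publish(fix)         # only corroborated fixes reach the filter

    rospy.init_node('gps_prefilter')
    pub = rospy.Publisher('gps/fix_filtered', NavSatFix, queue_size=10)
    rospy.Subscriber('gps/fix', NavSatFix, cb)
    rospy.spin()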
5

answered 2016-03-02 06:12:50 -0600

demmeln

On top of what Tom said, there are also situations where, even with proper covariances and filter configuration, you might get very large discontinuities in your map frame, while the odom frame can still serve as a continuous frame for control. Consider the setup mentioned by Tom with odom+IMU+GPS, and all covariances set up perfectly. Now if the robot goes to an area without GPS coverage (e.g. inside or close to a building), and the GPS does not provide any input for a certain period of time, the filter will estimate position purely from odometry data. The positional covariance will correctly grow by a substantial amount in this time. When the robot gets a GPS fix again, and the first GPS position with a (correctly) small variance is fused, the filter position will (correctly) jump close to that measurement immediately.
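A toy 1-D Kalman filter (made-up numbers) shows the effect: during the outage the variance grows, so the first fix afterwards is trusted almost completely and the estimate jumps to it.

    # Toy 1-D Kalman filter (made-up numbers): during a GPS outage the state
    # variance grows, so the first fix after the outage is trusted almost
    # completely and the estimate jumps to it.
    x, P = 0.0, 1.0          # position estimate and its variance
    Q = 0.5                  # per-step process noise from odometry drift
    R = 1.0                  # GPS measurement variance (assumed correct)

    def predict():
        global P
        P += Q               # odometry-only prediction: variance grows

    def update(z):
        global x, P
        K = P / (P + R)      # Kalman gain -> 1 as P >> R
        x += K * (z - x)
        P *= (1.0 - K)

    for step in range(20):   # 20 steps with no GPS: P grows from 1 to 11
        predict()
    print("before fix: x=%.1f P=%.1f" % (x, P))
    update(15.0)             # first fix after the outage, far from the estimate
    print("after fix:  x=%.1f P=%.1f" % (x, P))   # x jumps close to 15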

Now if you assume your odometry can drift a lot (without being corrected) over certain periods of time, you will run into additional problems. E.g. when doing global planning, your bad position estimate might result in obstacles from the map and obstacles from onboard sensors (e.g. a laser scanner) not overlapping anymore and, taken together, blocking all paths. If your localization can get that bad at certain points, you will have to deal with that regardless of whether you are using a single frame or the map->odom->base_link setup. However, the odom frame can still maintain the invariant that, over a short period of time, it is a good approximation of the relative movement of the robot, regardless of what happens to global localization. Sometimes you need that property, e.g. for control, or maybe for planning local maneuvers that don't depend on a global position.

Apart from that, there is actually one more utility to the map->odom->base_link tf frame setup. It provides a simple means of fusing slower global localization with more frequent odometry. E.g. in the amcl+wheel_odom setup, you might have odom->base_link updating at 50Hz from your wheel encoders, but map->odom updating only at 10Hz. Still, on the TF tree you will get the chain map->base_link updating at 50Hz (provided you set up the transform_tolerance appropriately, which admittedly is a bit of a kludge). Now you can certainly achieve the same with "proper" sensor fusion algorithms like a Kalman filter and just a single frame, but this is how it has been done in simple, default setups in ROS for a long time.
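For completeness, this is roughly how a localization node ends up contributing only the map->odom link of that chain (a sketch of the bookkeeping, not amcl's actual code):

    # Sketch: a localizer that estimates map->base_link publishes map->odom by
    # composing with the inverse of the current odom->base_link, so the faster
    # odometry still supplies the last leg of the chain.
    import rospy
    import tf
    from tf import transformations as tfs

    rospy.init_node('map_to_odom_sketch')
    listener = tf.TransformListener()
    broadcaster = tf.TransformBroadcaster()

    def publish_map_to_odom(map_T_base):
        # map->odom = map->base_link * inverse(odom->base_link)
        (t, q) = listener.lookupTransform('odom', 'base_link', rospy.Time(0))
        odom_T_base = tfs.concatenate_matrices(tfs.translation_matrix(t),
                                               tfs.quaternion_matrix(q))
        map_T_odom = tfs.concatenate_matrices(map_T_base,
                                              tfs.inverse_matrix(odom_T_base))
        broadcaster.sendTransform(tfs.translation_from_matrix(map_T_odom),
                                  tfs.quaternion_from_matrix(map_T_odom),
                                  rospy.Time.now(), 'odom', 'map')

    # Example: if the localizer currently thinks base_link is at x = 10 m in map:
    # publish_map_to_odom(tfs.translation_matrix((10.0, 0.0, 0.0)))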

