
Position estimation based only on geometry_msgs::Twist

asked 2022-03-20 08:48:26 -0500

Laschoking

Hi, I am using a ROSbot 2.0 PRO running ROS Kinetic (I can't upgrade due to some dependencies).

So far I have used the rosbot_ekf package (which runs the robot_localization package) for navigation; rostopics like /odom and /pose are available. As far as I understand, robot_localization calculates the position based on sensor data from the wheel encoders, IMU, etc. I would like to compare this estimated value (the most likely position) with my "ground truth" (where the robot should have been if it worked without errors). For example, when I execute the following command (at position [x = 0, y = 0, z = 0]):

rostopic pub -1 /cmd_vel geometry_msgs/Twist '{linear: {x: 1.0, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}'

robot_localization might return a new position [x = 1.01, y = -0.001, z = 0] because of effects like wheel slip. However, the value I am missing is the "ground truth" of [x = 1.00, y = 0.0, z = 0]. This is an easy example, but for advanced tasks (like operation via teleop_twist_keyboard) the "ground truth" value is difficult to calculate.

I am interested in this value because I can imagine it being useful in position estimation, alongside my sensor data.
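To make the idea concrete: under the assumption of perfect control, the "expected" pose can be obtained by integrating the commanded velocities over time. A minimal sketch in plain Python (the DeadReckoner helper is hypothetical, not part of rosbot_ekf or robot_localization):

```python
import math

class DeadReckoner(object):
    """Integrates commanded (v, w) to predict where the robot *should* be
    if control were perfect (no slip, no latency). Hypothetical helper
    for illustration only."""

    def __init__(self):
        self.x = self.y = self.yaw = 0.0

    def update(self, v, w, dt):
        # Simple Euler step on the unicycle model: advance along the
        # current heading, then rotate.
        self.x += v * math.cos(self.yaw) * dt
        self.y += v * math.sin(self.yaw) * dt
        self.yaw += w * dt
        return self.x, self.y, self.yaw

# Driving straight at 1 m/s for 1 s (100 steps of 10 ms) should
# predict roughly [x = 1.0, y = 0.0]:
dr = DeadReckoner()
for _ in range(100):
    dr.update(1.0, 0.0, 0.01)
```

In a real setup you would feed update() from a /cmd_vel subscriber, with dt taken from the message arrival times.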


1 Answer


answered 2022-05-18 06:42:34 -0500

Tom Moore

Just so we're on the same page:

  • None of the nodes in robot_localization can detect or account for wheel slip. However, I think what you're saying is "if the wheels slipped, the EKF won't know that, so it will think the robot has moved forward, even if it hasn't." This is accurate.
  • So what you are looking for is a ground-truth measurement of where your robot really is. Let me know if I have that wrong.

Assuming I am correct, the only way to obtain a ground truth estimate will be through the use of some kind of external measurement system, e.g., a motion capture system or overhead camera system. If there were an "on robot" way to know the robot's true pose, there would be no need for state estimation packages, because you'd already have the robot's true pose.

As an alternative, does the manufacturer of your robot provide a Gazebo simulation of it? You could directly compare the output of the EKF or UKF with Gazebo's perfect ground truth of the robot's true state. However, it will likely require some effort to model effects like wheel slip.
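Whichever ground-truth source you use (motion capture or Gazebo), the comparison itself reduces to a pose difference. A minimal sketch for planar (x, y, yaw) poses; pose_error is a hypothetical helper name:

```python
import math

def pose_error(est, truth):
    """Planar pose error between an estimate and a ground-truth pose,
    each given as (x, y, yaw). Returns (translation error in metres,
    absolute heading error in radians, wrapped to [-pi, pi])."""
    ex, ey = est[0] - truth[0], est[1] - truth[1]
    # Wrap the yaw difference so that e.g. +3 rad vs -3 rad compares
    # across the branch cut correctly.
    dyaw = (est[2] - truth[2] + math.pi) % (2.0 * math.pi) - math.pi
    return math.hypot(ex, ey), abs(dyaw)
```

For the example in the question, pose_error((1.01, -0.001, 0.0), (1.0, 0.0, 0.0)) gives a translation error of about 1 cm and zero heading error.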


Comments

Another possible approach could be to use parameterized curves (such as line segments, arcs, Bézier curves, etc.) with the assumption of constant velocity, calculating how much progress should be achieved with the given controls. The estimated position could then be compared to the expected one.
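For constant (v, w) over an interval, the expected pose along such a curve has a closed form: a straight segment when w = 0, a circular arc otherwise. A sketch of this noise-free velocity motion model (arc_pose is an illustrative name):

```python
import math

def arc_pose(x, y, yaw, v, w, dt):
    """Exact pose after driving constant linear velocity v and angular
    velocity w for dt seconds on the unicycle model."""
    if abs(w) < 1e-9:
        # Degenerate case: straight-line segment.
        return (x + v * math.cos(yaw) * dt,
                y + v * math.sin(yaw) * dt,
                yaw)
    r = v / w  # signed turning radius of the arc
    new_yaw = yaw + w * dt
    return (x + r * (math.sin(new_yaw) - math.sin(yaw)),
            y + r * (math.cos(yaw) - math.cos(new_yaw)),
            new_yaw)
```

For example, v = 1 m/s with w = pi/2 rad/s for 1 s traces a quarter circle of radius 2/pi, ending at (2/pi, 2/pi) with a 90-degree heading.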

ljaniec ( 2022-05-18 08:24:11 -0500 )

But that would assume control is perfect. The robot's pose error will be the result of a combination of errors, including error in the control of the robot itself. You'd be masking that effect if you went this route.

Tom Moore ( 2022-05-24 02:42:38 -0500 )

Yes, that's the flaw in this approach, you're right. However, when combined with the current position of the robot from an external system like OptiTrack and the predicted position on the paths assuming no control errors, it gives a better picture of how much error occurred in the pose estimation. If there is no external localization system available, it's better than nothing.

ljaniec ( 2022-05-24 02:54:48 -0500 )

Hi @tom and @ljaniec, thank you for your answers. I was looking for a position estimate obtained only by listening to "cmd_vel" commands. This would be similar to a (very useless) robot_localization node that doesn't use sensors and relies only on the "user-control" commands. This would allow, as ljaniec stated, comparing the "cmd_vel" position to some other, measured position. The term "velocity motion model" (from Thrun & Fox, Probabilistic Robotics) probably best describes what I wanted. However, the book states that a motion model based on odometry is more accurate, so I decided to stick with normal localization techniques.

Laschoking ( 2022-07-25 08:55:40 -0500 )

If you output a copy of the cmd_vel that is stamped (i.e., a geometry_msgs/TwistWithCovarianceStamped message), then you can fuse that into the filter directly.
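A minimal sketch of such a republisher follows. The topic names, frame_id, and covariance values are placeholder assumptions to tune for your robot; the ROS imports are kept inside main() so the covariance helper runs without a ROS installation:

```python
def twist_covariance(var_vx, var_vy, var_wz):
    """6x6 row-major covariance over (vx, vy, vz, wx, wy, wz).
    Only the diagonal terms for vx, vy, and yaw rate are set; the
    variance values passed in are placeholders, not measured data."""
    cov = [0.0] * 36
    cov[0], cov[7], cov[35] = var_vx, var_vy, var_wz
    return cov

def main():
    import rospy
    from geometry_msgs.msg import Twist, TwistWithCovarianceStamped

    rospy.init_node('cmd_vel_stamper')
    pub = rospy.Publisher('cmd_vel_stamped', TwistWithCovarianceStamped,
                          queue_size=10)

    def callback(msg):
        # Wrap the incoming unstamped Twist with a timestamp, frame,
        # and covariance so robot_localization can consume it.
        out = TwistWithCovarianceStamped()
        out.header.stamp = rospy.Time.now()
        out.header.frame_id = 'base_link'
        out.twist.twist = msg
        out.twist.covariance = twist_covariance(0.05, 0.05, 0.1)
        pub.publish(out)

    rospy.Subscriber('cmd_vel', Twist, callback)
    rospy.spin()

# Call main() on the robot; the stamped topic can then be configured as
# a twist input to the EKF.
```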

Tom Moore ( 2022-07-25 09:21:43 -0500 )
