
robot_pose_ekf on an inclined plane [closed]

asked 2012-04-18 21:32:31 -0500 by ChickenSoup, updated 2016-05-17 02:50:57 -0500

Hi all,

I have some questions about combining odometry with IMU for a rover navigating on a terrain.

1) (not necessarily related to ROS) What is the standard method to combine IMU and odometry for an uneven-terrain rover?

  • Is it that we assume a momentary plane based on the IMU, apply the odometry data along that plane, and finally transform it to the world frame?

2) In odom_estimation.cpp (robot_pose_ekf), absolute measurements are converted to relative odometry measurements in the horizontal plane, and only the yaw measurement of the IMU is used.

  • So, does that mean robot_pose_ekf is only for 2D navigation? The package summary mentions that it uses a 6D model (3D position and 3D orientation).

Thank you for your time.

CS


Closed for the following reason: the question is answered; the right answer was accepted by Martin Günther (close date 2016-05-17 03:24:19)

1 Answer


answered 2012-04-19 03:52:59 -0500 by DimitriProsser

Have a look at my answer here. It talks about how one could construct the transform tree using an IMU.

As for robot_pose_ekf, it relies on its incoming sensor data to perform its calculations. Wheel odometry is 2D by its very nature. Your IMU can publish a 3D orientation to robot_pose_ekf, but the robot's elevation (z-value) will never change unless you have a source that publishes a z-value. I'm not sure whether robot_pose_ekf rotates the 2D plane of wheel odometry into a world reference frame (so that the IMU's orientation can influence elevation); I've never used it, so I'm only speaking from general experience.
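If the filter does not rotate the wheel-odometry plane itself, the rotation is straightforward to do by hand. Here is a minimal sketch (plain NumPy, not robot_pose_ekf's actual code) of rotating a planar odometry increment by the IMU's roll/pitch/yaw so that forward motion on a slope produces a change in elevation:

```python
import numpy as np

def rpy_to_matrix(roll, pitch, yaw):
    """Rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll) (ZYX Euler, x forward, z up)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

# Wheel odometry reports a 10 cm forward increment in the robot's own plane.
d_body = np.array([0.10, 0.0, 0.0])

# IMU attitude: pitched -10 degrees (nose up in this right-handed, z-up convention).
R = rpy_to_matrix(0.0, np.radians(-10.0), 0.0)

# Rotated into the world frame, the increment gains a z (elevation) component.
d_world = R @ d_body   # x ~ 0.0985 m, z ~ +0.0174 m
```

Accumulating `d_world` instead of `d_body` is essentially the "momentary plane" idea from the question: the IMU defines the plane, the wheel odometry supplies the displacement along it.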

That leaves your GPS and visual odometry to publish elevation data. I can't speak for everyone, but I never take the z-measurement from my GPS unit (if you're using one); I find it terribly unreliable. That leaves /vo (visual odometry) as your only source for elevation. If you publish a z-value to /vo, you should be able to achieve 3D navigation.


Comments

Thanks for your answer @DimitriProsser. I am not using vo at the moment. I think it will still assume the odometry data to be on the horizontal plane even if I publish a z-value to /vo, and the resulting x, y values will be wrong. Maybe the best option is to rewrite a robot_pose_ekf-like node...

ChickenSoup (2012-04-19 04:12:04 -0500)

...so that I can use the IMU roll and pitch data to estimate the momentary plane the robot moves on when calculating odometry.

ChickenSoup (2012-04-19 04:29:37 -0500)

If you do so, you might want to try rotating all of the sensor data into the world frame before calculation. You can then transform the data back to a relative frame for publishing.

DimitriProsser (2012-04-19 04:33:36 -0500)

The 2D plane used by wheel odometry is fixed in the x-y plane, and does _not_ rotate when the robot's pitch or roll angle changes. This was done to prevent the z-coordinate of the robot from drifting away. That said, it would be reasonable to dynamically rotate the plane for wheel odometry.

Wim (2012-06-19 10:20:35 -0500)
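Wim's point about z drift can be illustrated numerically. The sketch below (a hypothetical NumPy toy, not robot_pose_ekf code) integrates forward motion on flat ground two ways: with the odometry plane fixed in x-y, and with each increment naively rotated by a noisy IMU pitch reading. The fixed plane keeps z at exactly zero; the rotated version lets z random-walk away even though the true elevation never changes:

```python
import numpy as np

rng = np.random.default_rng(42)
n_steps = 20_000
dx = 0.01  # 1 cm forward per odometry step, true pitch is zero (flat ground)

# The IMU reports noisy pitch readings around the true value of zero.
pitch = rng.normal(0.0, np.radians(0.5), n_steps)

# Fixed horizontal plane (robot_pose_ekf's choice): z stays exactly zero.
z_fixed = 0.0

# Naively rotating every increment by the noisy pitch: z drifts.
# (z-up convention: forward increment dx picks up a -dx*sin(pitch) z component.)
z_rotated = float(np.sum(-dx * np.sin(pitch)))

print(f"z_fixed = {z_fixed} m, z_rotated = {z_rotated:.4f} m")
```

This is the trade-off Wim describes: rotating the plane is the physically correct thing to do on a real slope, but with a noisy attitude estimate and no absolute z measurement, the fixed plane is the only way to guarantee z does not drift.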
