
Can robot_localization fuse raw gyro/accel data with odometry?

asked 2016-09-09 12:41:47 -0500

Bhavya gravatar image

I have a robot set up with accelerometer/gyroscope sensors and wheel encoder-based odometry. The inertial sensors provide only linear accelerations and angular velocities - i.e. they are not being used to generate a quaternion estimate. I have been trying to correct the quaternion estimate produced by the wheel odometry, since it is subject to slippage. However, my experiments seem to indicate that for robot_localization to correct the odometry orientation estimate, the IMU must provide an orientation. Have I understood the robot_localization package correctly?


1 Answer


answered 2016-10-09 04:54:21 -0500

Tom Moore gravatar image

Most IMUs produce three sets of data:

  1. Orientation
  2. Angular velocity
  3. Linear acceleration

This maps well to the sensor configuration for robot_localization (r_l). If you check out the wiki, specifically the section on parameters, you'll see that you can specify an IMU input like so:

<rosparam param="imu0_config">[false, false, false,
                               false, false, false,
                               false, false, false,
                               true,  true,  true,
                               true,  true,  true]</rosparam>

I'll leave it to you to look at the ordering of the parameters on the wiki, but this configuration will use only the angular velocity and linear acceleration from an IMU.
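For readers who want the short version: the state vector ordering documented on the wiki is position, orientation, linear velocity, angular velocity, linear acceleration, three components each. Annotated (rosparam bodies are parsed as YAML, so `#` comments are allowed), the same config reads:

```
<rosparam param="imu0_config">[false, false, false,  # x, y, z
                               false, false, false,  # roll, pitch, yaw
                               false, false, false,  # vx, vy, vz
                               true,  true,  true,   # vroll, vpitch, vyaw
                               true,  true,  true]   # ax, ay, az
</rosparam>
```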



What about using gyro data only? Are angular velocities enough for a correct state estimation?

Jasmin gravatar image Jasmin  ( 2017-03-15 09:35:58 -0500 )edit

It depends on which subset of the state you care about. You can estimate the orientation using just a gyroscope, but if you want to estimate positions / linear accelerations, you would at least need an accelerometer.
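To illustrate the point above (this is a toy sketch, not robot_localization internals): integrating a gyro's yaw rate does recover heading, but nothing in the gyro data constrains position, so the position estimate would be unobservable without another sensor.

```python
# Toy illustration: dead-reckoning heading from a constant gyro yaw rate.
# Nothing here says anything about where the robot *is* - only which way
# it is pointing.
dt = 0.01          # integration step, seconds
yaw = 0.0          # initial heading, radians
yaw_rate = 0.5     # gyro reading, rad/s (constant for illustration)

for _ in range(100):   # simulate 1 second
    yaw += yaw_rate * dt

print(yaw)  # heading after 1 s; position remains unknown
```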

Bhavya gravatar image Bhavya  ( 2017-03-15 09:58:33 -0500 )edit

OK, thank you @Bhavya

Jasmin gravatar image Jasmin  ( 2017-03-15 14:52:11 -0500 )edit

@Tom Moore What if I want to use yaw measurements from the 'orientation'? What would the yaml file look like? Assume "orientation" is represented as a quaternion.

lffox gravatar image lffox  ( 2017-07-04 15:23:59 -0500 )edit

@Tom Moore Would really appreciate your reply on this

lffox gravatar image lffox  ( 2017-07-05 13:53:50 -0500 )edit

@lffox: please show a little patience. Your last comment was only 23 hours ago. All contributors to ROS Answers do this on a voluntary basis, meaning they don't get paid. I'm sure @Tom Moore will respond whenever he has time (and feels like it).

gvdhoorn gravatar image gvdhoorn  ( 2017-07-05 14:57:43 -0500 )edit

@gvdhoorn I understand. I didn't mean to offend anyone here. I re-commented on the off chance he had missed it. Also, this information is crucial to something I'm working on under a time constraint. I hope you understand.

lffox gravatar image lffox  ( 2017-07-05 17:16:53 -0500 )edit

@lffox The wiki has a section on sensor configuration and the variable ordering. Quaternions get converted to Euler angles internally.
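A sketch of what that config could look like, assuming the default state ordering from the wiki (only the sixth entry, yaw, is enabled; r_l converts the incoming quaternion to roll/pitch/yaw internally and then fuses just the yaw component):

```
<rosparam param="imu0_config">[false, false, false,
                               false, false, true,
                               false, false, false,
                               false, false, false,
                               false, false, false]</rosparam>
```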

Tom Moore gravatar image Tom Moore  ( 2017-07-06 05:17:40 -0500 )edit
