Robot_localization IMU frame tf

asked 2018-04-03 14:46:29 -0500

mewbot

I'm running a simulation in Gazebo, combining odometry data (from the differential drive plugin) with IMU data (from the GazeboRosImuSensor plugin). I'm quite new to ROS and I'm thoroughly confused by the ENU and NED frames.

This is what I did:

1) Placed the IMU in its neutral position.

2) Set up a static transform broadcaster that broadcasts my base_link to imu_link transform; this is where I applied the rotation.

This is the snippet from my launch file:

<node pkg="tf" type="static_transform_publisher" name="imu_tf_publisher" args="0 0 0 1.57 0 3.14 base_link imu_link 10"/>

I have rolled it by 180° and yawed by 90° (the static_transform_publisher arguments are given as yaw pitch roll in radians, hence 1.57 0 3.14).
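As a sanity check (not part of the original post), the yaw = 1.57, roll = 3.14 combination can be verified numerically. tf composes yaw/pitch/roll as extrinsic rotations about the fixed x, y, z axes, and with these values the resulting matrix maps a NED-style sensor frame (x = North, y = East, z = Down) onto an ENU-style body frame:

```python
import numpy as np

# Sketch: verify that yaw = pi/2, roll = pi performs the NED -> ENU swap.
yaw, pitch, roll = np.pi / 2, 0.0, np.pi

def rot_x(a):
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a),  np.cos(a)]])

def rot_y(a):
    return np.array([[ np.cos(a), 0, np.sin(a)],
                     [0, 1, 0],
                     [-np.sin(a), 0, np.cos(a)]])

def rot_z(a):
    return np.array([[np.cos(a), -np.sin(a), 0],
                     [np.sin(a),  np.cos(a), 0],
                     [0, 0, 1]])

# Extrinsic x-y-z (roll, then pitch, then yaw) composition, as tf uses:
R = rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

# Columns show where each child axis lands in the parent frame:
# child x -> parent y, child y -> parent x, child z -> parent -z,
# which is exactly the NED -> ENU axis swap.
print(np.round(R))
```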

I tested the IMU and it shows the correct accelerations as described here: ( http://docs.ros.org/lunar/api/robot_l... )

But when I visualize my odometry data, the unfiltered odometry (from the diff drive plugin) and the filtered odometry (from robot_localization) have opposite orientations. This happens only when I include my IMU as an input to the robot_localization package.


Comments

Setting the IMU differential parameter to true solved the problem of opposite orientation. However, I'm still not confident that my IMU is properly oriented. Can someone please confirm?

mewbot (2018-04-03 22:56:57 -0500)

1 Answer


answered 2018-05-15 06:22:05 -0500

Tom Moore

It's helpful to post launch/config files and sample input messages for these questions, but I can probably answer this anyway.

Your wheel encoders and the differential drive controller have no concept of absolute orientation, so when you start driving the robot around, the diff drive controller always starts with a heading of 0 and integrates wheel velocities, giving you a heading (that is subject to drift).
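As an illustration (the numbers below are made up), the diff-drive heading is just an integral of angular velocity, so it is only ever referenced to wherever the robot was pointing at startup:

```python
import math

# Illustration with made-up values: dead-reckoned heading from wheel
# odometry. The diff-drive controller has no absolute reference, so it
# starts at heading 0 and integrates angular velocity.
dt = 0.1                # integration step [s]
omega = [0.1] * 10      # angular velocity samples [rad/s],
                        # e.g. (v_right - v_left) / track_width
heading = 0.0           # always starts at zero
for w in omega:
    heading += w * dt

# After 1 s the integrated heading is ~0.1 rad, measured from the
# (arbitrary) startup pose, not from any world-fixed frame.
print(heading)
```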

The EKF is taking in velocity data from the wheel encoders and _absolute_ orientation data from the IMU. So when your robot starts, you get an IMU yaw measurement, and the EKF output matches it. Then you drive forward, and the EKF fuses those velocities in the correct frame, and so your EKF state estimate will have a different orientation than your wheel encoders.

In short: your wheel encoder odometry and IMU do not share a common heading. If they did, you wouldn't need an IMU. The filter is doing what it's designed to do: fusing data from two separate sources into one state estimate.

Using differential mode converts the IMU absolute orientation data to velocity, so you'll do away with that particular issue, though I question why you want your EKF state estimate to match your wheel encoder odometry.
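For reference, differential mode is enabled per sensor input on ekf_localization_node; a minimal launch-file sketch (parameter names are from robot_localization, topic names are assumed, and a real configuration also needs the per-sensor `_config` matrices, omitted here):

```xml
<node pkg="robot_localization" type="ekf_localization_node" name="ekf_se">
  <param name="odom0" value="/odom"/>             <!-- assumed topic -->
  <param name="imu0" value="/imu/data"/>          <!-- assumed topic -->
  <!-- fuse orientation deltas instead of absolute orientation -->
  <param name="imu0_differential" value="true"/>
</node>
```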

