How to transform Odometry
I have an odometry source (VIO) which has a (static) transform from the base_link of the robot.
How can I properly transform the Odometry message to the base_link frame?
For position and orientation it's almost trivial with tf, but for Twist?
What about the covariances?
I know I could set base_link at the sensor position, as is commonly done, but I want to compare the odometry against other sensors such as an IMU.
Just modify the code according to your needs: Publishing Odometry Information over ROS (python)
Thank you @Orhan, but I just want to transform the twist component into another frame. I can use the tf library to transform the pose, but not the twist.
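For the twist part, since the sensor-to-base_link transform is static, the standard rigid-body lever-arm formula applies. Here is a minimal numpy sketch (not tf code); it assumes the static transform is known as a rotation matrix `R` and a translation `p` of the sensor expressed in base_link (names of my own choosing):

```python
import numpy as np

def twist_to_base_link(v_s, w_s, R, p):
    """Transform a twist measured in the sensor frame into base_link.

    v_s, w_s : linear and angular velocity, expressed in the sensor frame
    R        : 3x3 rotation of the sensor frame in base_link
    p        : position of the sensor origin in base_link
    """
    # Angular velocity is the same for every point of a rigid body;
    # it only needs to be re-expressed in the base_link axes.
    w_b = R @ w_s
    # Linear velocity picks up a lever-arm correction for the offset sensor.
    v_b = R @ v_s - np.cross(w_b, p)
    return v_b, w_b
```

As a sanity check: a sensor mounted 1 m ahead of base_link on a robot spinning in place at 1 rad/s measures a 1 m/s tangential velocity, while base_link itself is stationary.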
You can always create a topic like /my_odom with type nav_msgs/Odometry and publish the odometry data there to inspect the twist part and the covariance. For the IMU, just as with the odometry frame, start your node with the IMU's frame at the map frame location (with zero offset), and then publish your IMU measurement changes over time to tf. If you don't want to use ready-made IMU drivers, just put your IMU's frame in your URDF and use robot_state_publisher.
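On the covariance question from the original post: the twist covariance can be propagated through the same rigid-body transform with its Jacobian, Σ_b = J Σ_s Jᵀ. A sketch, again assuming a static transform given as rotation `R` and translation `p` (hypothetical names), and the nav_msgs/Odometry 6x6 ordering (linear components first, then angular):

```python
import numpy as np

def skew(p):
    """Skew-symmetric matrix such that skew(p) @ x == np.cross(p, x)."""
    return np.array([[0.0, -p[2], p[1]],
                     [p[2], 0.0, -p[0]],
                     [-p[1], p[0], 0.0]])

def transform_twist_covariance(cov_s, R, p):
    """Propagate a 6x6 twist covariance through the rigid-body transform:
        v_b = R v_s + [p]x R w_s
        w_b = R w_s
    so the Jacobian is block upper-triangular.
    """
    J = np.zeros((6, 6))
    J[0:3, 0:3] = R              # d v_b / d v_s
    J[0:3, 3:6] = skew(p) @ R    # d v_b / d w_s (lever-arm coupling)
    J[3:6, 3:6] = R              # d w_b / d w_s
    return J @ cov_s @ J.T
```

Note the off-diagonal block: with a nonzero offset p, uncertainty in angular velocity leaks into the linear-velocity covariance, so simply rotating the two 3x3 blocks independently would underestimate it.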