Providing external odometry to the ROS navigation stack
I am using a combination of the RealSense T265 and D435 and trying to improve the TurtleBot2's navigation capabilities using RTAB-Map. The SLAM algorithm seems to work just fine, and the camera's position is tracked quite accurately. The problem comes when trying to use the ROS navigation stack, which considers the robot's location to be base_footprint, not my camera's position (which is more accurate). My TF tree starts with map -> odom, which has two children: base_footprint and t265_odom_frame. Is there a way of linking base_footprint as a child of t265_odom_frame? The odom -> base_footprint TF transform is published by the mobile_base_nodelet_manager node. I assume I have to change this node so it publishes a t265_odom_frame -> base_footprint transform instead, but I am not sure how and where I can do that.
As a reference, I started with this tutorial: rtab navigation, and slightly modified it to work with the RS T265. In my demo, I can see the robot model following base_footprint while diverging from the pose estimate provided by the camera.
Thanks!
Asked by b.lazarescu on 2019-12-09 12:18:27 UTC
Answers
You may find the answer in the Turtlebot part of this Google Tango tutorial: http://wiki.ros.org/rtabmap_ros/Tutorials/Tango%20ROS%20Streamer#Turtlebot
There, Google Tango is giving odometry to the camera frame, not to the base_footprint of the TurtleBot. The hack is to make base_footprint a child of the camera frame, unless you can tell the T265 driver in which frame you want the odometry expressed (which would be base_footprint instead of the camera frame). I think the T265 can also take wheel odometry as input; maybe they already have something done for that.
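As a minimal sketch of that hack, a small node could publish base_footprint as a static child of the T265 pose frame. The frame name "t265_pose_frame" and the mounting offset below are assumptions; take the real values from your URDF/extrinsics:

    #!/usr/bin/env python
    # Hypothetical sketch: publish base_footprint as a static child of the
    # T265 pose frame so the navigation stack follows the camera odometry.
    # Frame names and the offset are assumptions; take them from your URDF.
    import rospy
    import tf2_ros
    from geometry_msgs.msg import TransformStamped

    if __name__ == "__main__":
        rospy.init_node("t265_to_base_footprint")
        broadcaster = tf2_ros.StaticTransformBroadcaster()

        t = TransformStamped()
        t.header.stamp = rospy.Time.now()
        # Parent: the pose frame published by the T265 driver
        # (child of t265_odom_frame in the TF tree).
        t.header.frame_id = "t265_pose_frame"
        t.child_frame_id = "base_footprint"
        # Inverse of the base_footprint -> camera mounting offset
        # (example values: camera 10 cm forward, 20 cm above the base).
        t.transform.translation.x = -0.10
        t.transform.translation.y = 0.0
        t.transform.translation.z = -0.20
        t.transform.rotation.w = 1.0  # identity rotation for this sketch

        broadcaster.sendTransform(t)
        rospy.spin()

You would also need to stop mobile_base_nodelet_manager from publishing its own odom -> base_footprint transform (for the Kobuki base this is, I believe, the publish_tf parameter), so that base_footprint has only one parent in the TF tree.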
cheers,
Mathieu
Answered by matlabbe on 2019-12-12 11:35:01 UTC
Comments