
Can T265 Odometry Pose be centred on robot rather than sensor when using wheel Odometry input?

asked 2021-07-30 04:33:22 -0500 by SmallJoeMan, updated 2021-07-30 17:07:22 -0500

We've set up our robot to use the T265 sensor for odometry using the official realsense-ros package (foxy branch). We've configured the launch to fuse input from the T265 camera with wheel odometry from our motor controllers.

It wasn't clear to me whether the Odometry output by the realsense node is centred on the robot or on the T265 sensor when using wheel odometry input, so we ran a quick experiment with the sensor mounted on a boom 0.7 m in front of the robot.

[Image: the T265 mounted on a boom 0.7 m in front of the robot]

We spun the robot on the spot, and the reported odometry clearly traces a circle around the robot's pivot, so the Odometry is evidently centred on the T265 sensor, not the robot.

[Image: odometry trace circling the robot's pivot while spinning on the spot]

Is there a setting that I have missed that allows the Odometry output to be centred on the robot (i.e. in base_link)? Otherwise we have to transform the Pose and Twist in a separate node, which is a pain.

Update:

The relevant part of our TF tree is the following (ignoring other sensors, like an RGB camera, which also hang off base_link):

map -> odom -> base_link -> odom_sensor

  • map -> odom is, for this test, a static zero-offset, zero-rotation transform.
  • odom is the frame in which the robot moves, starting from x=0, y=0, rotation=0.
  • base_link is the robot frame and is centred between the differential-drive wheels.
  • odom_sensor is the frame of the T265, which is normally ~10 cm behind the drive wheels but for this test was 70 cm in front on a boom (published as a static transform, see the command below).
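In case it helps, the boom transform for the test was published with a static publisher along these lines (using foxy's positional x y z yaw pitch roll argument order):

    ros2 run tf2_ros static_transform_publisher 0.7 0 0 0 0 0 base_link odom_sensor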

We have configured the t265 node to output Odometry with the Pose in odom (header.frame_id = odom). It is not clear whether the Twist (child_frame_id) should be in odom_sensor or base_link, but our experiments seem to imply that the sensor outputs Twists in odom_sensor. We therefore need to offset the Pose in odom to account for the translation between base_link and odom_sensor, and also transform the Twist so that, for example, when spinning on the spot the Twist in base_link has zero linear velocity and non-zero angular velocity in z.
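To make that concrete, the correction we think is needed looks roughly like this (a minimal planar sketch, untested; r_sb is the base_link offset expressed in the sensor frame, assumed rigid, and the two frames are assumed to share orientation):

    import numpy as np

    def recenter_planar(p_s, yaw, v_s, w_z, r_sb):
        """Re-centre a planar pose/twist from the sensor onto base_link.

        p_s  : sensor position (x, y) in the odom frame
        yaw  : sensor heading in the odom frame (shared with base_link)
        v_s  : sensor linear velocity (vx, vy) in the sensor frame
        w_z  : angular velocity about z (same for every point on a rigid body)
        r_sb : base_link position (x, y) expressed in the sensor frame
        """
        c, s = np.cos(yaw), np.sin(yaw)
        R = np.array([[c, -s], [s, c]])
        # Pose: slide the sensor pose along the rotated sensor->base offset.
        p_b = np.asarray(p_s) + R @ np.asarray(r_sb)
        # Twist: rigid-body velocity transfer, v_b = v_s + w x r.
        v_b = np.asarray(v_s) + w_z * np.array([-r_sb[1], r_sb[0]])
        return p_b, yaw, v_b, w_z

With the boom setup (r_sb = (-0.7, 0)) and the robot spinning on the spot, v_b comes out as zero while w_z is untouched, which is the behaviour we want in base_link.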

We currently feed Odometry from the motor controllers into the T265 node with the Twist in base_link (i.e. linear velocity always along x and angular velocity along z). However, I wonder if that is incorrect, as I note the node only uses the linear velocity.
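For reference, we wire the wheel odometry into the wrapper roughly like this (a sketch; topic_odom_in and calib_odom_file are the wheel-odometry parameter names as we understand them from the wrapper's T265 support, so double-check them on the foxy branch, and the topic and file path here are placeholders):

    from launch import LaunchDescription
    from launch_ros.actions import Node

    def generate_launch_description():
        return LaunchDescription([
            Node(
                package='realsense2_camera',
                executable='realsense2_camera_node',
                parameters=[{
                    'device_type': 't265',
                    'enable_pose': True,
                    'topic_odom_in': '/wheel_odom',  # our motor-controller Odometry topic
                    'calib_odom_file': '/path/to/calibration_odometry.json',
                }],
            ),
        ])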


Comments

The odom frame does not work in the way you are describing it. Please tell us what your transform tree frame names are along a path from the "root_node_of_tree" down to the t265_sensor.

Mike Scheutzow ( 2021-07-30 15:12:13 -0500 )

Thanks for replying, I've updated my question with details of our reference frames. I hope it makes sense.

SmallJoeMan ( 2021-07-30 17:04:58 -0500 )

For the tf tree, please generate an image with rqt_tf_tree; it is easier to read. Also, please attach a sample odometry message (e.g. the output of rostopic echo -1 /odom) for the situation in the second image.

Humpelstilzchen ( 2021-08-01 02:16:32 -0500 )

Nice update to your question. You explained it very clearly.

Mike Scheutzow ( 2021-08-01 09:24:43 -0500 )

1 Answer


answered 2021-08-01 09:23:53 -0500 by Mike Scheutzow, updated 2021-08-01 10:25:09 -0500

Disclaimer: I've never used a t265, but I have looked at the data sheet and the realsense API. One thing I was not aware of on my first read of this question is that the wheel-odometry fusion happens inside the t265, and that the t265 chooses the origin of the odom frame outside your control. That makes me think your TF tree really looks like this:

map -> odom -> odom_sensor -> base_link

To answer your question: I didn't find any realsense API that lets you specify a "post-transform" operation to the odom_sensor pose.

The simplest hack would be to mount the t265 at base_link: this makes the transform to base_link the identity transform, so no calculation is necessary. If that is not possible, then I think you have no choice but to run a ros node to explicitly calculate the base_link pose & velocity.
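Such a node could look roughly like this (an untested rclpy sketch; the topic names and the offset constant are placeholders, and it assumes the sensor sits on the robot's x-axis at a fixed planar offset with the same orientation as base_link):

    import math
    import rclpy
    from rclpy.node import Node
    from nav_msgs.msg import Odometry

    SENSOR_TO_BASE_X = -0.7  # placeholder: base_link offset along the sensor x-axis (m)

    class RecenterOdom(Node):
        def __init__(self):
            super().__init__('recenter_odom')
            self.pub = self.create_publisher(Odometry, 'odom_base', 10)
            self.create_subscription(Odometry, 'camera/odom/sample', self.cb, 10)

        def cb(self, msg):
            # Planar yaw from the pose quaternion.
            q = msg.pose.pose.orientation
            yaw = math.atan2(2.0 * (q.w * q.z + q.x * q.y),
                             1.0 - 2.0 * (q.y * q.y + q.z * q.z))
            out = msg
            out.child_frame_id = 'base_link'
            # Pose: slide the reported position along the heading to base_link.
            out.pose.pose.position.x += SENSOR_TO_BASE_X * math.cos(yaw)
            out.pose.pose.position.y += SENSOR_TO_BASE_X * math.sin(yaw)
            # Twist: rigid-body transfer v_b = v_s + w x r, with r = (rx, 0, 0).
            out.twist.twist.linear.y += msg.twist.twist.angular.z * SENSOR_TO_BASE_X
            self.pub.publish(out)

    def main():
        rclpy.init()
        rclpy.spin(RecenterOdom())

    if __name__ == '__main__':
        main()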

Followup:

Take a look at this page: https://github.com/IntelRealSense/lib...

On that page, Intel writes the following, which I think is in agreement with my proposed TF tree above:

Extrinsic Calibration for Wheeled Odometry Examples

All calibration metrics are relative to T265 origin frame. I.e.: They are offsets/rotations from the T265 origin to the robot's origin. Said another way, we transform a point from frame B (Robot Odometry) to frame A (T265 Pose) or A_p = H_AB * B_p, where A_p is the point expressed in frame A, B_p is the point expressed in frame B, and H_AB is the corresponding homogeneous transformation.

In order for the T265 to consume and use odometry data correctly it must know where the robot's origin is relative to itself. This is accomplished by feeding a calibration (json) file into the librealsense API with this initial data.

In this basic sample robot setup, for simplicity, we assume the X-axis of the camera and robot's coordinate system are aligned (+X coming out of the screen) and only concern ourselves with the Y and Z axis.
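For reference, the calibration file that page refers to has roughly this shape as I understand it (the values below are placeholders; T and W are the translation and rotation of the wheel-odometry frame relative to the T265, so check the linked doc for the exact schema):

    {
        "velocimeters": [
            {
                "scale_and_alignment": [1.0, 0.0, 0.0,
                                        0.0, 1.0, 0.0,
                                        0.0, 0.0, 1.0],
                "noise_variance": 0.004,
                "bias_variance": 0.0,
                "extrinsics": {
                    "T": [0.0, -0.1, -0.7],
                    "T_variance": 0.0,
                    "W": [0.0, 0.0, 0.0],
                    "W_variance": 0.0
                }
            }
        ]
    }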


Comments

Yes, thank you. I think you are right that we'll need to position the sensor at the robot pivot, or transform the Pose and Twist ourselves. I think odom -> odom_sensor -> base_link is probably a more natural tree structure, but ultimately it doesn't get around the fact that the transforms need to be applied in a separate node.

From the Intel readme I came to the same conclusion: the positions of the 'velocimeter' sensors are given relative to the t265, so that the algorithm knows how to interpret the velocimeter data and update the internal tracker. However, it should be straightforward to invert that. Surely we're not the only people who want to track the robot and not the sensor!

SmallJoeMan ( 2021-08-02 05:17:46 -0500 )
