
ROS Transform between robot and map

asked 2021-02-10 08:46:17 -0500

Nextar

I have an odometry message which gives the current position of my robot, for example odometry.pose.pose.position = (x=1, y=1, z=0).

Then I have a sensor mounted on the robot at an offset of (x+1.2, y+0, z+0).

Let's assume there is a point P1 seen by the sensor at (x=2, y=0, z=0) in the sensor frame. It would therefore be at (x=3.2, y=0, z=0) in the robot frame and at (x=4.2, y=1, z=0) in the world frame, ignoring orientation.

But how do I correctly calculate the position taking the odometry orientation into account, using the geometry_msgs::Quaternion (from the odometry message) and the geometry_msgs::Pose of point P1?

Also, could someone explain or link me to resources on how to do this with TF instead of calculating it manually?


1 Answer


answered 2021-02-10 11:33:52 -0500

Roberto Z.

updated 2021-02-10 11:49:45 -0500

I assume you have a ROS node that is correctly publishing an odometry transform. If you don't, have a look at the "Publishing Odometry Information over ROS" tutorial on the ROS wiki first.

Next, you should also have a ROS driver for your ranging sensor. If so, your distance measurements (published as a ROS message) are expressed in the sensor reference frame.

Then you need to set up a static transform in order to express where the sensor is located with respect to your robot frame. There are several ways to do this, for instance:

  • Starting a static_transform_publisher via launch file
  • As a ROS node
  • Using a URDF model of your robot and robot_state_publisher to broadcast TF values
  • Running a static_transform_publisher via command line tools

Here is an example of how to set up a static transform from a launch file:

<launch>
  <node name="camera_pose" pkg="tf2_ros" type="static_transform_publisher" 
    args="1.2 0.0 0.0 0.0 0.0 0.0 base_link camera_link" />
</launch>

You will need to set the actual transform values based on your real robot. Under args you define an x/y/z offset in meters and a yaw/pitch/roll rotation in radians (yaw is rotation about Z, pitch about Y, and roll about X).
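
Once both the odometry transform (odom → base_link) and the static transform (base_link → camera_link) are being published, you can let tf2 chain them for you instead of doing the quaternion math yourself. Below is a minimal C++ sketch of such a lookup; the frame names match the examples above, and the node and variable names are just placeholders:

#include <ros/ros.h>
#include <geometry_msgs/PointStamped.h>
#include <tf2_ros/transform_listener.h>
#include <tf2_geometry_msgs/tf2_geometry_msgs.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "point_transformer");
  ros::NodeHandle nh;

  // The buffer and listener subscribe to /tf and /tf_static in the background.
  tf2_ros::Buffer tf_buffer;
  tf2_ros::TransformListener tf_listener(tf_buffer);

  // Point P1 from the question, expressed in the sensor frame.
  geometry_msgs::PointStamped p_sensor;
  p_sensor.header.frame_id = "camera_link";
  p_sensor.header.stamp = ros::Time(0);  // ros::Time(0) = latest available transform
  p_sensor.point.x = 2.0;

  try
  {
    // tf2 chains camera_link -> base_link -> odom in one call,
    // odometry orientation included.
    geometry_msgs::PointStamped p_odom =
        tf_buffer.transform(p_sensor, "odom", ros::Duration(1.0));
    ROS_INFO("P1 in odom frame: (%.2f, %.2f, %.2f)",
             p_odom.point.x, p_odom.point.y, p_odom.point.z);
  }
  catch (tf2::TransformException& ex)
  {
    ROS_WARN("Could not transform point: %s", ex.what());
  }
  return 0;
}

The same transform() call also works on a geometry_msgs::PoseStamped if you need the orientation of P1 as well.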

To publish a transform from odom to map you need a node that can solve the global localization problem.
For instance you could use AMCL, but AMCL is too broad a subject to discuss in detail here.

Here are some other resources to get you started:

TF tutorials in general

How to write your own ROS node that acts as a static transform broadcaster:
- tf2 static broadcaster (C++)
- tf2 static broadcaster (Python)

If you choose to follow the URDF + robot_state_publisher path:
- URDF tutorials
- Using the robot state publisher on your own robot

To fire up a static_transform_publisher from the terminal, run the following command:

rosrun tf2_ros static_transform_publisher 1.2 0 0 0 0 0 1 base_link camera_link

Note that with seven numeric arguments they are interpreted as x y z qx qy qz qw (here the identity quaternion), whereas the six-argument form used in the launch file above is read as x y z yaw pitch roll.

Finally, to inspect the TF tree you can generate a PDF of it with this command:

rosrun tf view_frames
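
As an aside, if you do want to do the calculation from your question by hand instead of going through TF, the usual recipe is: rotate the point by the odometry quaternion, then add the odometry translation. A rough sketch using tf2's math types (sensorPointToWorld is just a hypothetical helper, and the numbers are the ones from your question):

#include <geometry_msgs/Point.h>
#include <nav_msgs/Odometry.h>
#include <tf2/LinearMath/Quaternion.h>
#include <tf2/LinearMath/Vector3.h>
#include <tf2_geometry_msgs/tf2_geometry_msgs.h>

geometry_msgs::Point sensorPointToWorld(const nav_msgs::Odometry& odom)
{
  // P1 in the robot frame: sensor offset (1.2 m) plus measurement (2.0 m).
  tf2::Vector3 p_robot(1.2 + 2.0, 0.0, 0.0);

  // Odometry orientation as a tf2 quaternion.
  tf2::Quaternion q;
  tf2::fromMsg(odom.pose.pose.orientation, q);

  // Rotate into the world frame, then add the odometry translation.
  tf2::Vector3 p_world = tf2::quatRotate(q, p_robot) +
      tf2::Vector3(odom.pose.pose.position.x,
                   odom.pose.pose.position.y,
                   odom.pose.pose.position.z);

  geometry_msgs::Point out;
  out.x = p_world.x();
  out.y = p_world.y();
  out.z = p_world.z();
  return out;
}

With an identity orientation this gives (4.2, 1, 0), matching the numbers in your question; for any other orientation the quaternion rotation takes care of the rest.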
