I assume you have a ROS node that is correctly publishing an Odometry transform. If you don't, have a look at the odometry publishing tutorial first.
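For reference, a minimal sketch of what such an odometry TF broadcaster can look like in Python (the frame names odom and base_link are conventions, and the identity pose is a placeholder; fill it in from your odometry source):

#!/usr/bin/env python
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

rospy.init_node('odom_tf_broadcaster')
br = tf2_ros.TransformBroadcaster()
rate = rospy.Rate(20)  # broadcast at 20 Hz

while not rospy.is_shutdown():
    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = 'odom'      # parent frame
    t.child_frame_id = 'base_link'  # robot base frame
    # Fill in translation/rotation from your odometry here;
    # identity rotation used as a placeholder.
    t.transform.rotation.w = 1.0
    br.sendTransform(t)
    rate.sleep()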
Next, you should also have a ROS driver for your ranging sensor. If so, your distance measurements (published as a ROS message) are expressed in the sensor reference frame.
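You can check which frame your driver stamps the measurements with by echoing the message header (the topic name /sensor/range is just a placeholder for your driver's actual topic):

rostopic echo /sensor/range/header/frame_id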
Then you need to set up a static transform in order to express where the sensor is located with respect to your robot frame. There are several ways to do this, for instance:
Here is an example of how to set up a static transform from a launch file:
<launch>
<node name="camera_pose" pkg="tf2_ros" type="static_transform_publisher"
args="1.2 0.0 0.0 0.0 0.0 0.0 base_link camera_link" />
</launch>
TO-DO: set the actual transform values based on your real robot. Under args you define an x/y/z offset in meters and a yaw/pitch/roll rotation in radians (yaw is rotation about Z, pitch about Y, and roll about X).
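Once that transform is being published, any node can express sensor data in the robot frame through tf2. A minimal sketch (the PointStamped usage and the 0.5 m measurement are assumptions; your ranging driver may publish a different message type):

#!/usr/bin/env python
import rospy
import tf2_ros
import tf2_geometry_msgs  # registers PointStamped support for tf2
from geometry_msgs.msg import PointStamped

rospy.init_node('range_to_base_link')
buf = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(buf)
rospy.sleep(1.0)  # give the listener time to fill the buffer

# A hypothetical measurement 0.5 m in front of the sensor.
p = PointStamped()
p.header.frame_id = 'camera_link'
p.header.stamp = rospy.Time(0)  # use the latest available transform
p.point.x = 0.5

p_base = buf.transform(p, 'base_link', rospy.Duration(1.0))
rospy.loginfo('Point in base_link: %s', p_base.point)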
To publish a transform from odom to map you need a node that solves the global localization problem.
For instance, you could use AMCL, although AMCL is too broad a subject to discuss in detail here.
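As a rough sketch, running AMCL usually amounts to loading a map and starting the amcl node (the package name my_robot, the map path, and the scan topic are assumptions; AMCL also has many tuning parameters not shown here):

<launch>
  <!-- Serve a previously recorded map -->
  <node name="map_server" pkg="map_server" type="map_server"
        args="$(find my_robot)/maps/my_map.yaml" />
  <!-- AMCL publishes the map -> odom transform -->
  <node name="amcl" pkg="amcl" type="amcl">
    <remap from="scan" to="/scan" />
  </node>
</launch>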
Here are some other resources to get you started:
How to write your own ROS node that sets up a static transform broadcaster (a minimal Python sketch follows the links below):
- tf2 static broadcaster (C++)
- tf2 static broadcaster (Python)
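For reference, a minimal Python version might look like this (frame names and the offset mirror the launch file example above; adjust them to your robot):

#!/usr/bin/env python
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

rospy.init_node('camera_static_tf')
br = tf2_ros.StaticTransformBroadcaster()

t = TransformStamped()
t.header.stamp = rospy.Time.now()
t.header.frame_id = 'base_link'
t.child_frame_id = 'camera_link'
t.transform.translation.x = 1.2  # sensor 1.2 m ahead of base_link
t.transform.rotation.w = 1.0     # identity rotation (no tilt)

br.sendTransform(t)  # static transforms are latched; sending once is enough
rospy.spin()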
If you choose to follow the URDF + robot_state_publisher path (see the URDF snippet after these links):
- URDF tutorials
- Using the robot state publisher on your own robot
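With that approach, the same camera offset would be expressed as a fixed joint in the URDF, which robot_state_publisher then broadcasts as a static transform (the robot and joint names here are assumptions, mirroring the earlier example):

<robot name="my_robot">
  <link name="base_link" />
  <link name="camera_link" />
  <!-- Fixed joint: camera mounted 1.2 m ahead of base_link, no rotation -->
  <joint name="camera_joint" type="fixed">
    <parent link="base_link" />
    <child link="camera_link" />
    <origin xyz="1.2 0 0" rpy="0 0 0" />
  </joint>
</robot>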
To fire up a static_transform_publisher from the terminal, run the following command:
rosrun tf2_ros static_transform_publisher 1.2 0 0 0 0 0 1 base_link camera_link
Note that this seven-value form takes a quaternion (x y z qx qy qz qw, here the identity rotation); the six-value yaw/pitch/roll form used in the launch file above is also accepted.
Finally, to look at the TF tree you can generate a PDF of it with this command:
rosrun tf view_frames
The output is written to frames.pdf in the current directory.