
Transform the map frame into a custom global coordinate system

asked 2023-01-19 09:40:45 -0500 by xdzye

updated 2023-01-22 16:48:35 -0500 by matlabbe

I'm running RTABMap, and mapping works correctly with lidar, odometry, and RGB-D sensor data. In order to overlay the generated map with a CAD model, I need to transform the map frame into the custom coordinate system of the CAD model / lab environment.
I'm using AprilTag markers with known (a priori) coordinates relative to the custom coordinate system and detect them while mapping. The translation part of the transformation from the map_frame to my custom coordinate system's origin works fine with the help of Marker/Priors in RTABMap (see original question here); see this for the documentation of the parameter. Basically, this parameter lets you input global coordinates, and RTABMap will translate the map_frame so that the local tag detections match their corresponding global coordinates. Unfortunately, Marker/Priors will not rotate the map_frame into my custom coordinate system: using custom coordinate systems instead of the ROS coordinate system definitions would break REP 105/REP 103, which is why it is not supported in RTABMap.

Is there a way to add the correct rotation manually, during or after mapping, by checking the map_frame orientation, comparing it to the fixed custom coordinate system, and then applying the right rotation angles?

Or, can I guarantee by modifying the map_frame that a fixed coordinate direction related to my environment is always present in the map_frame, based on the cardinal directions or other constraints? Normally, the direction and origin of the ROS coordinate frame depend on the starting angle/direction of the camera.
A small example: I want the x-axis to always point north, the y-axis to always point east, etc., during or after the mapping process. This would help, because then I would only need to rotate the axes with a static transform; the rotation would be constant with this solution for every scan I do.
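If such a fixed heading can be established (for example, by reading a magnetometer once at mapping start), the remaining correction is a single yaw rotation. A minimal numpy sketch of building that constant correction; the 30-degree angle and the compass heading source are assumptions for illustration, not something RTABMap provides:

```python
import numpy as np

def rotation_about_z(theta):
    """4x4 homogeneous transform: pure rotation by theta (radians) about z."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c,  -s,  0.0, 0.0],
                     [s,   c,  0.0, 0.0],
                     [0.0, 0.0, 1.0, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

# Hypothetical: angle of the map x-axis relative to the desired world x-axis,
# e.g. read once from a compass/magnetometer when mapping starts. The sign
# convention depends on whether the heading is measured world->map or map->world.
theta = np.deg2rad(30.0)
T_world_map = rotation_about_z(theta)  # could be published as a static transform
```

Once this angle is known for a given setup, the transform is constant and could be published with a tf2 static transform broadcaster.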
To illustrate, I quote this comment:

If you build a map with rtabmap and save it, you can then calculate the static rotation needed based on where the map x-axis ended up. Doing this dynamically is more difficult: you will need to read the initial magnetic orientation of the robot once before you begin the rtabmap mapping procedure.

I've tried to find out how to implement it this way by asking the following question under the answers.ros.org post mentioned above:

Can you explain to me how to calculate the static rotation from the RTABMap map_frame to my own custom coordinate system? How do I find out where the axes of the ROS coordinate system point after running rtabmap (and/or dynamically), and how much I need to rotate them to fit my custom coordinate system?
The translation part is already working inside RTABMap with the help of the Marker/Priors parameter, by setting global coordinates for the known accurate positions of the markers and detecting exactly these markers in local coordinates ...




Marker/Priors will not rotate the map_frame into my custom coordinate system, because this is breaking REP105/3 by using custom coordinate systems instead of ROS coordinate system

Are you using a right-hand coordinate system or not?

If the answer is yes, are you aware that tf2 is designed to easily handle this situation?

Mike Scheutzow (2023-01-20 08:17:56 -0500)

Hello Mike, yes - the custom coordinate system is right-handed.

No, I'm not aware. I was working with tf1 mostly and only recently started using tf2. Can you provide me with a link?

Thank you very much for helping!

xdzye (2023-01-20 08:25:06 -0500)

Feature-wise, tf1 is similar to tf2 - the methods just got cleaned up. You can create a new Frame, publish the transform between it and an existing frame, and you're done. Then tf2 provides methods to convert a pose from one frame to another. So where is your confusion?

Mike Scheutzow (2023-01-21 09:34:42 -0500)

My confusion lies in the fact that the rotation between my coordinate system and the ROS coordinate system provided by RTABMap is always different, because the origin and direction of the ROS coordinate system are relative to the camera direction and angle at which the SLAM algorithm is started on the robot. I don't know the transformation itself and need to compute it. That's why I was asking if there is a way to have a fixed direction, for example along the cardinal directions, so that the rotation is always the same and I can just apply a static transform for all starting points. Otherwise, I do have the tf map -> tag from the AprilTag package, and I also measured world -> tag for some specific tags, which would mean I can compute world -> map with (world -> tag) * (map -> tag)^(-1) for these specific tags, right? Would ...

xdzye (2023-01-22 06:25:55 -0500)

1 Answer


answered 2023-01-22 08:04:19 -0500 by Mike Scheutzow

updated 2023-01-22 08:15:04 -0500

I can compute world -> map with (world -> tag) * (map -> tag)^(-1) for these specific tags, right?

Yes, that is correct. If using C++, you'll probably want to create tf::Transform objects for this calculation; you'll use the setOrigin(), setRotation() and inverse() methods.
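The same calculation can be sketched in plain numpy with 4x4 homogeneous matrices; the tag poses below are made-up example values, not numbers from the question:

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# world -> tag: hypothetical tag at (2, 1, 0) in the CAD/world frame,
# rotated 90 degrees about z
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
T_world_tag = make_transform(Rz90, [2.0, 1.0, 0.0])

# map -> tag: the same tag as observed (via AprilTag) in the RTABMap map frame
T_map_tag = make_transform(np.eye(3), [1.0, 0.0, 0.0])

# world -> map = (world -> tag) * (map -> tag)^(-1)
T_world_map = T_world_tag @ np.linalg.inv(T_map_tag)
```

With tf2, the resulting world -> map transform could then be published as a static transform so that everything in the map frame is expressible in world coordinates.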

Would that be the best solution?





Edited to remove my "4x4 matrix" comment. It turns out that internally tf::Transform uses a different representation.

Mike Scheutzow (2023-01-22 08:16:31 -0500)

Ok, thank you!

What would be the best way to deal with different estimated transforms between different tags / different ROS times? Should I just average them? I'm currently using Python, and I noticed inverse() is not defined in that API. I guess I will need to implement my own inverse function, or switch over to the C++ API.

xdzye (2023-01-22 09:37:18 -0500)

For Python, look in tf.transformations. This old question #q229329 has an inverse example.

For multiple reference points, to get started I'd just average them. If that turns out to give poor results, you could look at some kind of outlier-removal.

Mike Scheutzow (2023-01-22 10:05:29 -0500)
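The averaging suggestion could be sketched as follows: translations averaged component-wise, quaternions sign-aligned to the first estimate, summed, and renormalized (a common approximation that is reasonable when the estimates are close together). The function name and data layout are mine, not from any ROS API:

```python
import numpy as np

def average_transforms(translations, quaternions):
    """Average several (translation, quaternion) estimates of the same transform.

    translations: list of [x, y, z]
    quaternions:  list of [x, y, z, w] unit quaternions
    """
    t_avg = np.mean(np.asarray(translations, dtype=float), axis=0)

    qs = np.asarray(quaternions, dtype=float)
    ref = qs[0]
    # q and -q represent the same rotation; flip all estimates into the
    # same hemisphere as the first one before summing.
    signs = np.sign(qs @ ref)
    signs[signs == 0] = 1.0
    q_sum = (qs * signs[:, None]).sum(axis=0)
    q_avg = q_sum / np.linalg.norm(q_sum)
    return t_avg, q_avg
```

For widely separated rotation estimates, this approximation degrades, and the outlier removal mentioned above (or a proper quaternion averaging via the eigenvector method) would be the next step.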

Thank you very much. In the meantime, I also got it working using Marker/Priors. They actually do support rotation as well; I had just set the rotation parameters with wrong values.

xdzye (2023-01-22 11:29:18 -0500)
