
Problems with PX4, RealSense camera and RTAB-Map

asked 2021-10-08 21:19:51 -0600 by kiprock


I'm testing out RTAB-Map for use with PX4 SITL. I think I have everything set up OK, but when I rotate the drone, the costmap gets these slices in it, as you can see in RViz.

I'm using the simulated RealSense camera attached to the drone. I'm pretty sure it has to do with my camera intrinsics, or maybe incorrect placement on the drone.

I'm launching RTAB-MAP in a fairly standard way:

<include file="$(find rtabmap_ros)/launch/rtabmap.launch">
        <arg name="rtabmap_args"       value="--delete_db_on_start"/>
        <!-- <arg name="rtabmap_args"   value="&#45;&#45;delete_db_on_start &#45;&#45;Optimizer/GravitySigma 0.3"/> (use double dashes in args) -->
        <!-- <arg name="frame_id"       value="sim_camera_left"/> -->
        <arg name="visual_odometry"    value="false"/>
        <!-- <arg name="rgbd_topic"     value="/rgbd_image"/> -->
        <arg name="odom_topic"         value="/mavros/local_position/odom"/>
        <arg name="rgb_topic"          value="/camera/camera/color/image_raw"/>
        <arg name="depth_topic"        value="/camera/camera/depth/image_raw"/>
        <arg name="camera_info_topic"  value="/camera/camera/color/camera_info"/>
        <arg name="approx_sync"        value="true"/>
        <arg name="queue_size"         value="10"/>
        <arg name="rviz"               value="true"/>
        <arg name="rtabmapviz"         value="true"/>

        <!-- use actionlib to send goals to move_base -->
        <param name="use_action_for_goal" type="bool" value="true"/>
        <remap from="move_base"           to="/move_base"/>
</include>


Does anyone have suggestions as to how to resolve this problem? Thank you in advance!

(screenshots: the sliced costmap as seen in RViz)



It could be that this shows the error between localization and odometry, related to queue_size: when it localizes it paints a new picture, but due to the queue it keeps showing the "old" one as well. From the picture it looks like there could be 10 overlays. You might also try RTAB-Map's odometry. I am not sure about the quality of your odometry source, but I have had bad experiences with consumer-IMU-based odometry, especially in yaw.

— Dragonslayer (2021-10-09 05:37:30 -0600)

Either /mavros/local_position/odom is drifting, or, if your odom is perfect in the simulator, the camera/odom synchronization may not be that great. Use odom_frame_id instead of the topic, and set approx_sync to false. You may also explicitly set frame_id to the base frame of the drone.

— matlabbe (2021-10-10 15:23:59 -0600)
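For anyone following along, that suggestion might look roughly like this inside the rtabmap.launch include (a sketch only; the argument names come from rtabmap_ros's rtabmap.launch, and the frame names are assumptions that depend on your TF tree):

```xml
<include file="$(find rtabmap_ros)/launch/rtabmap.launch">
    <!-- read odometry from TF instead of a topic; "odom" frame is an assumption -->
    <arg name="odom_frame_id" value="odom"/>
    <!-- exact synchronization, since TF poses are interpolated -->
    <arg name="approx_sync"   value="false"/>
    <!-- base frame of the drone; "base_link" is an assumption -->
    <arg name="frame_id"      value="base_link"/>
</include>
```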

OK, I think I have found a clue. Looking at the depth cloud generated by the camera system, I see correct depth estimation; however, the RTAB-Map grid map is very skewed. I cannot figure out why this is happening.

— kiprock (2021-10-13 15:32:30 -0600)

1 Answer


answered 2021-10-13 19:32:43 -0600 by kiprock, updated 2021-10-14 16:29:59 -0600

OK, I think I have sorted this, for anyone else in this situation. Basically the intrinsics of the depth and RGB cameras do not match, resulting in misalignment between the depth and RGB images. I'm using the RealSense Gazebo plugin in ROS. Normally you are supposed to set align_depth=true for the real camera; however, the simulated one does not have this option as far as I can see. So instead, I went in and set the FOV of the RGB camera to the same as the depth camera's in the RealSense's model.sdf:
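The edit was along these lines (a sketch of the relevant part of the plugin's model.sdf; the sensor name and the FOV value here are illustrative and depend on the model you are using):

```xml
<sensor name="color" type="camera">
  <camera name="color">
    <!-- set to the same value as the depth sensor's <horizontal_fov>
         so the RGB and depth intrinsics match (1.5184 rad is illustrative) -->
    <horizontal_fov>1.5184</horizontal_fov>
    <!-- ...image, clip, and noise elements unchanged... -->
  </camera>
</sensor>
```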


Then I set RTAB-Map's camera info topic to the depth camera's info topic:

<arg name="camera_info_topic" value="/camera/camera/depth/camera_info"/>

Now the depth and color images are aligned and I have a much nicer result:

(screenshot: the aligned map)

Note that I am using the mavros-provided odom topic (calculated by the FCU/PX4). I have never seen anyone else use this odom topic for RTAB-Map, but then again, there really are not many examples combining RTAB-Map and PX4 that I can find. Ultimately I want to simulate flight in an indoor, non-GPS environment, but I had to turn on GPS in the simulation, otherwise the drone will not hold position (even though I have an optical flow sensor on the model). In real life it can hold position OK without GPS. Maybe someone out there has the solution for this.


Update: when I close the loop, the map falls apart. Why would this happen?

Here is a video




Nice find. You could try RTAB-Map's odometry (vision based). I also found the issue explained in the docs. They state "... allow vehicles to navigate when a global position source is unavailable or unreliable (e.g. indoors, ..."

— Dragonslayer (2021-10-14 05:17:30 -0600)

Could be due to sync problems, see this. It's not exactly the same problem as yours, since there the map comes out that way from the get-go, whereas your problem materializes as your computer seems to start lagging, which could be a clue. Maybe try a simpler Gazebo world to free up CPU resources and see if the issue/lagging persists. I know I brought this up already, but did you try RTAB-Map's odometry yet? It is likely synced internally. Additional interesting reads: link text, link text

— Dragonslayer (2021-10-15 06:40:22 -0600)

I think the odometry provided by the FCU is good, but I will try RTAB-Map's odometry; in real life I would have to use something like that anyway. I think my problem here is that when loops are detected, RTAB-Map starts publishing TF offsets between map2 and map, so my odometry is effectively doubling up. When I first start rotating the drone, map and map2 are exactly aligned, but when loops close, map2 jumps around, causing distortion. See this TF tree (I have publish_tf=false). Here is what happens when map2 jumps around: the RTAB map no longer aligns to the depth. See this image of the map.

— kiprock (2021-10-15 14:59:02 -0600)

For the problem in the video, see my comment here

— matlabbe (2021-10-15 16:06:10 -0600)

The TF tree looks suboptimal. Usually it's map -> odom -> base_footprint (projection of base_link onto a plane/floor in the z dimension) -> base_link -> etc. One should not have a second TF tree with a single robot. odom -> odom_ned? I am not sure about this setup, but I have come across some nodes that actually used "hardcoded" topics, especially for odometry. When I look at your picture, I remember having that issue way back as well; I think it was about frame_id in the end. A quick search brought up this. Did you set "frame_id" to "map" instead of base_link?

— Dragonslayer (2021-10-16 06:11:22 -0600)

Thanks again for your comments. I should mention that this is a little different from a rover setup, because PX4/mavros automatically creates the odom -> odom_ned frame, even though it is not referred to by any topics and, from what I can see, can be ignored. There is a lot of confusion out there as to why this is the case, and it is summed up well in this issue. If PX4 is not localized, you get these three disconnected trees, and if it is localized by any means (optical flow, GPS, whatever), a link is created between map and base_link. No topics seem to refer to the odom frame, so I don't know why they publish it. Anyway, add RTAB-Map to the mix and you get the map2 frame above everything,

— kiprock (2021-10-18 12:32:12 -0600)

and once loops begin to close, the RTAB map jumps around as I mentioned above. I do have frame_id=base_link. I know this is an unusual case, but I believe many people would like to use RTAB-Map with PX4 + RealSense. Ultimately, my goal is to use RTAB-Map to localize the drone once a map has been created. This can be achieved by sending VIO pose data to the drone via this process.

— kiprock (2021-10-18 12:42:18 -0600)
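As a sketch of that last step, the localization pose could be relayed into mavros with topic_tools (this assumes RTAB-Map publishes /rtabmap/localization_pose as a PoseWithCovarianceStamped and that the mavros vision_pose plugin is enabled; verify the topic names against your own setup):

```xml
<!-- relay rtabmap's localization pose to the mavros vision_pose plugin -->
<node pkg="topic_tools" type="relay" name="rtabmap_to_mavros"
      args="/rtabmap/localization_pose /mavros/vision_pose/pose_cov"/>
```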

I suppose, thinking this over a bit more, I should turn off GPS and rely on local camera-based odometry during the mapping process, since that is the ultimate goal anyway: simulated indoor flight with obstacle avoidance.

— kiprock (2021-10-18 13:59:46 -0600)
