
Why such a bad map using a depth camera?

asked 2019-03-01 11:00:04 -0500 by EdwardNur

updated 2019-03-06 04:45:31 -0500

I recently bought an Intel RealSense D435 and used the rtabmap package to perform SLAM.

I simply used this roslaunch:

And the result is this:

[image: resulting map]

The environment is this:

[image: the environment]

Any ideas?




Was this running on a real robot, not in Gazebo?

kitkatme  ( 2019-03-03 07:19:43 -0500 )

@kitkatme Yes, but it's worth noting that I just used my camera without any external odometry.

EdwardNur  ( 2019-03-03 07:23:35 -0500 )

Was the camera stationary, or did you move it? What kind of trajectory? If the camera is not moving, the noise in the cloud is coming from the sensor.

matlabbe  ( 2019-03-03 20:54:54 -0500 )

@matlabbe I did move the camera forward

EdwardNur  ( 2019-03-05 03:21:34 -0500 )

Can you share the resulting database? (default ~/.ros/rtabmap.db)

matlabbe  ( 2019-03-05 10:19:46 -0500 )

@matlabbe I have added the DB

EdwardNur  ( 2019-03-06 04:45:47 -0500 )

2 Answers


answered 2019-03-06 08:15:54 -0500 by matlabbe

updated 2019-03-06 08:18:01 -0500

Thanks for the db; there are some observations we can make (referring to the figure below):

  • The camera is very close to the ground, so more than 50% of the field of view is filled with a repetitive, symmetric pattern, which makes good depth estimation, as well as visual odometry, very difficult. This is why the ground is poorly represented in the occupancy grid.
  • The resulting depth image from the sensor is quite noisy (see the top view of the cloud in the right image). I suggest using Grid/3D=false and Grid/RayTracing=true to get more empty cells (see the grid in the left image):

    $ roslaunch rtabmap_ros rtabmap.launch rtabmap_args:="--delete_db_on_start --Grid/3D false --Grid/RayTracing true" depth_topic:=/camera/aligned_depth_to_color/image_raw rgb_topic:=/camera/color/image_raw camera_info_topic:=/camera/color/camera_info approx_sync:=false
  • There is considerable motion blur when the robot moves (see the top middle image; open the image in another tab to zoom in), which would affect visual odometry accuracy.

[image: occupancy grid (left), camera frames (middle), point cloud top view (right)]

Possible solutions:

  • Tilt the camera toward the ceiling (and/or increase its height) so that there is less ground in the FOV of the camera. Make sure to update the TF between base_link and camera_link as well.

  • Move slower to limit motion blur.

  • Consider using wheel odometry if possible.
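The TF update from the first bullet can be sketched with a static transform publisher. This is only an illustration: the 0.2 m height, 0.3 rad upward tilt, and 100 ms publish period are assumed values, not numbers from this thread.

```shell
# Publish base_link -> camera_link with the camera raised 0.2 m and
# tilted 0.3 rad toward the ceiling (upward tilt is a negative pitch
# in the ROS x-forward/z-up convention).
# Arguments: x y z yaw pitch roll frame_id child_frame_id period_in_ms
rosrun tf static_transform_publisher 0 0 0.2 0 -0.3 0 base_link camera_link 100
```

In a launch file you would normally declare the same transform as a `<node pkg="tf" type="static_transform_publisher" ...>` entry so it starts with the rest of the system.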



@matlabbe That is what I thought; probably the ground is too close. Thank you for your help.

EdwardNur  ( 2019-03-06 08:39:15 -0500 )

answered 2019-03-03 07:32:59 -0500 by kitkatme

This looks like the effect of poor localization, which makes sense given that, as you pointed out, you did not use any external odometry. It doesn't matter what kind of sensor you have: without a good localization method and carefully determined sensor offsets from the vehicle's center of motion, you won't get good maps.

I suggest finding the camera-to-vehicle IMU frame transform and then using the external odometry option mentioned in the tutorial.
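As a sketch of that external-odometry setup: rtabmap.launch can be told to skip its own visual odometry and subscribe to an odometry topic instead. The /odom topic name below is an assumption; use whatever your robot's driver actually publishes.

```shell
# Disable rtabmap's built-in visual odometry and feed external
# (e.g. wheel/IMU-fused) odometry instead. The /odom topic name
# is an assumed example, not from this thread.
roslaunch rtabmap_ros rtabmap.launch \
    visual_odometry:=false \
    odom_topic:=/odom \
    depth_topic:=/camera/aligned_depth_to_color/image_raw \
    rgb_topic:=/camera/color/image_raw \
    camera_info_topic:=/camera/color/camera_info
```

For this to work, TF must connect the odometry frame to the camera frame, which is why the camera-to-vehicle transform mentioned above matters.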



Hm, you might be right, but it seems hard to believe that visual odometry is that bad.

EdwardNur  ( 2019-03-03 07:36:08 -0500 )


1 follower


Asked: 2019-03-01 11:00:04 -0500

Seen: 673 times

Last updated: Mar 06 '19