
RTABMAP with two stereo cameras

asked 2017-08-23 14:17:51 -0500

psammut

Hello, first of all thanks to Mathieu Labbe for all his hard work on rtabmap and for bringing it to ROS!

I would like to use two stereo cameras to aid in SLAM and also path planning. I have been experimenting with one camera and I found that if I point it forward, SLAM works well but the path planner doesn't, because it has to guess about obstacles at the robot's feet. If I point the camera down, path planning works well but mapping suffers.

I'd like to use two stereo cameras to increase the path planner's obstacle detection accuracy while maintaining good map building.


The question is: what is the best way of doing this? rtabmap seems to be written to subscribe to only one stereo camera. Should I combine the point clouds generated by stereo_image_proc?

Also, I'm unsure how I should use my wheel odometry in this setup. It looks like rtabmap can only use one odometry source. Should I just do visual odometry, if it works well enough, and pipe that into rtabmap? I'm experimenting right now with wheel odometry data and one stereo camera with rtabmap, and I'm slightly unsure how I should connect these systems together.

Thanks!


1 Answer


answered 2017-08-24 17:12:31 -0500

matlabbe

I see two possible setups:

  1. Use the top camera for SLAM (global occupancy grid map) and the bottom camera to update the obstacle layer of the costmaps. That way the local and global planners will also plan according to obstacles detected by the bottom camera. To segment the cloud from stereo_image_proc of the bottom camera into ground and obstacles, you can use the rtabmap_ros/obstacles_detection nodelet (see the obstacles_detection sketch after this list).

  2. Use both cameras for the rtabmap and visual odometry nodes. To use multiple stereo cameras with rtabmap, they should have the same resolution and they should be converted into RGB-D format, using stereo_image_proc to generate the disparity image (the rtabmap_ros/disparity_to_depth nodelet can be used for convenience to convert it to a depth image). This is similar to this setup. When you have an RGB image, a depth image and a camera info topic for each camera, see the demo_two_kinects.launch example (subscribe_rgbd=true and rgbd_cameras=2). With stereo cameras, make sure to set the approx_sync parameter to false for the rtabmap_ros/rgbd_sync nodelets (see the rgbd_sync sketch below). The TF between the cameras and base_link should also be accurate to get good results.
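For setup 1, a rough launch sketch for the bottom camera could look like the following. The topic names (/stereo_bottom/points2 and the obstacles/ground outputs) and the base_link frame are assumptions to adapt to your robot; obstacles_detection is started standalone here rather than loaded into a camera nodelet manager:

    <!-- Segment the bottom camera's stereo_image_proc cloud into ground/obstacles.
         Topic and frame names are assumptions; adapt to your setup. -->
    <node pkg="nodelet" type="nodelet" name="obstacles_detection"
          args="standalone rtabmap_ros/obstacles_detection">
      <remap from="cloud"     to="/stereo_bottom/points2"/>
      <remap from="obstacles" to="/stereo_bottom/obstacles"/>
      <remap from="ground"    to="/stereo_bottom/ground"/>
      <param name="frame_id" type="string" value="base_link"/>
    </node>

The /stereo_bottom/obstacles cloud can then be used as a PointCloud2 observation source in the costmap obstacle layer.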
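For setup 2, here is a rough sketch of the wiring, assuming stereo_image_proc and disparity_to_depth already publish a rectified color image and a depth image in each camera namespace (/stereo_top and /stereo_bottom are made-up names):

    <!-- One rgbd_sync per camera; approx_sync=false since the left image, depth and
         camera_info come from the same stereo frame. Namespaces are assumptions. -->
    <node pkg="nodelet" type="nodelet" name="rgbd_sync_top"
          args="standalone rtabmap_ros/rgbd_sync">
      <remap from="rgb/image"       to="/stereo_top/left/image_rect_color"/>
      <remap from="depth/image"     to="/stereo_top/depth"/>
      <remap from="rgb/camera_info" to="/stereo_top/left/camera_info"/>
      <remap from="rgbd_image"      to="/stereo_top/rgbd_image"/>
      <param name="approx_sync" value="false"/>
    </node>
    <!-- ...same for the bottom camera, publishing /stereo_bottom/rgbd_image... -->

    <node pkg="rtabmap_ros" type="rtabmap" name="rtabmap">
      <param name="subscribe_rgbd" value="true"/>
      <param name="rgbd_cameras"   value="2"/>
      <remap from="rgbd_image0" to="/stereo_top/rgbd_image"/>
      <remap from="rgbd_image1" to="/stereo_bottom/rgbd_image"/>
      <remap from="odom"        to="/odom"/>
    </node>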

For odometry, rtabmap uses only one input, so you have to choose between wheel odometry and visual odometry. To mix two odometry sources together, see the robot_localization package. The output of robot_localization can then be used as input to rtabmap.
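As an illustration, a minimal ekf_localization_node configuration fusing wheel odometry with visual odometry could look like the sketch below; the topic names (/wheel_odom, /vo) and the selected variables are assumptions to tune for your robot:

    # Sketch of a robot_localization (ekf_localization_node) config fusing two
    # odometry inputs. Each *_config vector selects, in order:
    # [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az]
    frequency: 30
    two_d_mode: true
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    odom0: /wheel_odom              # wheel odometry: fuse planar and yaw velocities
    odom0_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, true,
                   false, false, false]

    odom1: /vo                      # visual odometry (e.g. from rtabmap_ros/stereo_odometry)
    odom1_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, true,
                   false, false, false]

The filtered output (/odometry/filtered) would then be remapped to rtabmap's odom input.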

Cheers,

Mathieu


Comments

Sounds great, thanks Mathieu. I found your answer to the odometry input question in another post, sorry for repeating the question!

psammut ( 2017-08-24 18:47:19 -0500 )

One more question: in the Stereo Outdoor Navigation tutorial you detail how to split the cloud into ground and obstacles. Why do this when you can send the whole cloud and just set min_obstacle_height in costmap_common_params? Thanks!

psammut ( 2017-08-25 09:22:00 -0500 )

In that example, the outdoor ground is not a perfect plane. In particular, when there is a transition in the terrain before a slope, we don't want the robot to think the slope is an obstacle. But yes, indoors you may just use min_obstacle_height with the raw/downsampled point cloud (see the snippet below).

matlabbe ( 2017-08-25 09:58:34 -0500 )
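For the indoor case, a hedged example of a single raw-cloud observation source with min_obstacle_height in costmap_common_params.yaml (topic and source names are made up):

    obstacle_layer:
      observation_sources: point_cloud_sensor
      point_cloud_sensor:
        topic: /stereo_camera/points2   # raw or downsampled cloud from stereo_image_proc
        data_type: PointCloud2
        marking: true
        clearing: true
        min_obstacle_height: 0.05       # drop points near the floor plane
        max_obstacle_height: 2.0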

Hello, I have a multi RGB-D camera setup on a mobile platform. I am able to map the environment by teleoperation using demo_two_kinects.launch, but how do I set up the obstacle detection nodelet manager for a multi-camera setup?

Anirvan0102 ( 2019-05-13 07:06:51 -0500 )

See this link, with observation_sources: point_cloud_sensor1 point_cloud_sensor2

matlabbe ( 2019-05-15 16:19:03 -0500 )
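Roughly, with one obstacles_detection output per camera, the obstacle layer could be configured like this (topic names are assumptions):

    obstacle_layer:
      observation_sources: point_cloud_sensor1 point_cloud_sensor2
      point_cloud_sensor1:
        topic: /camera1/obstacles       # obstacles cloud from the first obstacles_detection
        data_type: PointCloud2
        marking: true
        clearing: true
      point_cloud_sensor2:
        topic: /camera2/obstacles       # obstacles cloud from the second obstacles_detection
        data_type: PointCloud2
        marking: true
        clearing: true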
