
Noisy Occupancy Map with Zed Camera

asked 2016-10-28 11:28:47 -0500

Hi, I am trying to create an occupancy grid using a ZED stereo camera along with rtabmap_ros. My results are very noisy, and I am wondering whether the problem is the ZED camera, my rtabmap_ros configuration, or both. Can anyone verify whether rtabmap can deal with this level of sensor noise?


  • ZED camera calibrated using the calibration tool found in /usr/local/zed
  • Running on an NVIDIA TX1, Ubuntu 14.04, ROS Indigo
  • Indoor, well-lit room; ZED mounted on a mobile robot driven by joystick


Here is a sample of a map I have generated: (map image)

And here is an example of the depth cloud at one instant: (depth cloud image)

As you can see, the depth cloud is fairly noisy, but what sensor data isn't?

Launch File:


  <include file="$(find zed_wrapper)/launch/zed_tf.launch" />

  <arg name="svo_file" default=""/>

  <group ns="camera">
    <node name="zed_wrapper_node" pkg="zed_wrapper" type="zed_wrapper_node" args="$(arg svo_file)" >

      <param name="resolution"            value="2" />
      <param name="quality"               value="1" />
      <param name="sensing_mode"          value="1" />
      <param name="frame_rate"            value="30" />
      <param name="odometry_DB"           value="" />
      <param name="openni_depth_mode"     value="0" />

      <param name="rgb_topic"             value="rgb/image_rect_color" />
      <param name="rgb_cam_info_topic"    value="rgb/camera_info" />
      <param name="rgb_frame_id"          value="/zed_link" />

      <param name="left_topic"            value="left/image_rect_color" />
      <param name="left_cam_info_topic"   value="left/camera_info" />
      <param name="left_frame_id"         value="/zed_link" />

      <param name="right_topic"           value="right/image_rect_color" />
      <param name="right_cam_info_topic"  value="right/camera_info" />
      <param name="right_frame_id"        value="/zed_link" />

      <param name="depth_topic"           value="depth/image_rect_color" />
      <param name="depth_cam_info_topic"  value="depth/camera_info" />
      <param name="depth_frame_id"        value="/zed_link" />

      <param name="point_cloud_topic"     value="point_cloud/cloud" />
      <param name="cloud_frame_id"        value="/zed_link" />

      <param name="odometry_topic"                value="odom" />
      <param name="odometry_frame_id"             value="/zed_link" />
      <param name="odometry_transform_frame_id"   value="/zed_tracked_frame" />

      <param name="visual_odometry"       value="false" />
    </node>

    <!--Visual SLAM: args: "delete_db_on_start" and "udebug" -->
    <node name="rtabmap" pkg="rtabmap_ros" type="rtabmap" output="screen" args="--delete_db_on_start" >
      <param name="frame_id"         type="string" value="base_link"/>
      <param name="subscribe_stereo" type="bool" value="false"/>
      <param name="subscribe_depth"  type="bool" value="true"/>

      <remap from="rgb/image"        to="/camera/rgb/image_rect_color"/>
      <remap from="rgb/camera_info"  to="/camera/rgb/camera_info"/>
      <remap from="depth/image"      to="/camera/depth/image_rect_color"/>
      <remap from="odom"             to="/odometry/filtered_map"/>

      <param name="queue_size" type="int" value="30"/>

      <!-- RTAB-Map's parameters -->
      <param name="Rtabmap/TimeThr"                   type="string" value="700"/>
      <param name="Grid/DepthDecimation"              type="string" value="4"/>
      <param name="Grid/FlatObstacleDetected"         type="string" value="true"/>
      <param name="Kp/MaxFeatures"                    type="string" value="200"/>
      <param name="Kp/MaxDepth"                       type="string" value="10"/>
      <param name="Kp/DetectorStrategy"               type="string" value="0"/>   <!-- 0 = SURF -->
      <param name="SURF/HessianThreshold"             type="string" value="1000"/>
      <param name="Vis/EstimationType"                type="string" value="0"/>   <!-- 0=3D->3D, 1=3D->2D (PnP) -->
      <param name="RGBD/LoopClosureReextractFeatures" type="string" value="true"/>
      <param name="Vis/MaxDepth"                      type="string" value="10"/>
    </node>
  </group>
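Beyond tuning the ZED side, RTAB-Map itself can filter sparse noise out of the cloud used to build the grid. The parameter names below come from RTAB-Map's `Grid/` namespace in later releases and may not exist in the version used here, so treat this as a sketch to verify against your install (e.g. `rtabmap --params`):

```xml
<!-- Sketch: radius-based outlier removal on the occupancy-grid cloud.
     Parameter names and values are assumptions to verify against your
     RTAB-Map version, not tuned settings. -->
<param name="Grid/NoiseFilteringRadius"       type="string" value="0.05"/>
<param name="Grid/NoiseFilteringMinNeighbors" type="string" value="5"/>
```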



Can you please tell me in detailed steps how I can run your launch file? I am a novice. Thanks in advance.

ankursingh0820 (2017-07-18 04:13:20 -0500)

I got this launch file from here ( ). Follow the steps on that page to run the launch file. Let me know if you have any questions.

shoemakerlevy9 (2017-07-18 09:09:54 -0500)

I found this error: "ZED (Init) >> WARNING: FPS is too low to enable positional tracking and spatial mapping. Consider using PERFORMANCE parameters." FPS is set to 30, but the camera is on USB 2.0.

ankursingh0820 (2017-07-18 10:37:02 -0500)

In zed_camera.launch change the quality value to 1 if it isn't already. What computer are you running the ZED camera on? It needs a decent GPU to run.
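If the GPU can't keep up, reducing the resolution alongside the quality usually helps. Assuming the ZED wrapper of that era mapped resolution 2 to HD720 and 3 to VGA (worth verifying against your zed_wrapper version), a performance-oriented setting would be:

```xml
<!-- Assumed enum values: quality 1 = PERFORMANCE depth, resolution 3 = VGA -->
<param name="quality"    value="1" />
<param name="resolution" value="3" />
```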

shoemakerlevy9 (2017-07-18 10:56:36 -0500)

It's a Xeon with 8 GB RAM and an NVIDIA GTX 580 with 1.5 GB of GPU memory.

ankursingh0820 (2017-07-18 11:24:53 -0500)

Hi, I'm also a colleague of ankursingh0820. I am running it on a Jetson TX1 and still get the "FPS is too low" warning, but no output is generated.

neilj244 (2017-07-21 00:40:16 -0500)

2 Answers


answered 2016-11-04 11:50:58 -0500

updated 2016-11-04 11:51:17 -0500

Change the zed_wrapper_node quality parameter to "3" for the highest quality depth image.

0: None
1: Performance
2: Medium
3: Quality
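In the launch file from the question, that is a one-line change:

```xml
<param name="quality" value="3" />  <!-- 3 = Quality depth mode -->
```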



answered 2016-10-30 16:20:14 -0500

matlabbe

You may look at the ZED parameters to reduce depth interpolation. There is a confidence threshold that can be modified dynamically (use rqt_reconfigure) that would reduce the noisy points on textureless areas.
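For reference, assuming the node is named zed_wrapper_node as in the question's launch file, the threshold can be changed live either through the GUI or from the command line; the exact parameter name, confidence, is an assumption based on this thread, and the value 80 is just an illustrative starting point:

```shell
# Open the dynamic-reconfigure GUI and select zed_wrapper_node:
rosrun rqt_reconfigure rqt_reconfigure

# Or set the (assumed) confidence parameter directly:
rosrun dynamic_reconfigure dynparam set /zed_wrapper_node confidence 80
```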

Note that indoors, stereo cameras (like the ZED) tend to give poorer occupancy grid maps than RGB-D sensors (like the Kinect).




Thanks, I will look into using rqt_reconfigure. As far as indoor/outdoor goes, I intend to use my robot primarily outdoors, but am testing inside first. That's good to hear; I may actually see a boost in performance when I take it outside, away from the untextured world it's in now.

shoemakerlevy9 (2016-11-02 14:35:37 -0500)

After changing the depth mode to Quality I needed to turn the confidence threshold up to 100. To do this programmatically so that it's persistent, I added this to my launch file: (see next comment)

shoemakerlevy9 (2016-11-09 15:41:07 -0500)

<node name="$(anon dynparam)" pkg="dynamic_reconfigure" type="dynparam" args="set_from_parameters zed_wrapper_node" output="screen"> <param name="confidence" type="int" value="100" /> </node>

shoemakerlevy9 (2016-11-09 15:41:21 -0500)


Seen: 1,925 times

Last updated: Nov 04 '16