
Inaccuracy in depth pointcloud from RGB-D camera

asked 2018-10-25 04:19:27 -0500

percy_liu

updated 2018-10-25 21:59:00 -0500


I am using ROS Kinetic. I am trying to get a pointcloud of a door from an RGB-D camera on a robot. The robot description is shown below.

<?xml version="1.0"?>
<robot xmlns:xacro="http://ros.org/wiki/xacro">

  <!-- The asus_camera_model macro only adds the model, it does not also add
       the openni gazebo plugin. See the 'asus_camera' macro below for that. -->
  <xacro:macro name="realsense_camera_model" params="name parent *origin">
    <joint name="${name}_joint" type="fixed">
      <xacro:insert_block name="origin" />
      <parent link="${parent}"/>
      <child link="${name}_link"/>
    </joint>

    <link name="${name}_link">
      <inertial>
        <mass value="0.200" />
        <origin xyz="0 0 0" rpy="0 0 0" />
        <inertia ixx="5.8083e-4" ixy="0" ixz="0" iyy="3.0833e-5" iyz="0" izz="5.9083e-4" />
      </inertial>
      <visual>
        <origin xyz="0 0.051 0" rpy="0 0 0" />
        <geometry>
          <box size="0.007 0.130 0.02" />
        </geometry>
        <material name="DarkGrey">
          <color rgba="0.3 0.3 0.3 1"/>
        </material>
      </visual>
      <collision>
        <origin xyz="0 0.051 0" rpy="0 0 0" />
        <geometry>
          <box size="0.007 0.130 0.02" />
        </geometry>
      </collision>
    </link>

    <gazebo reference="${name}_link">
    </gazebo>
  </xacro:macro>

  <!-- The asus_camera macro adds the model and also adds
       the openni gazebo plugin. -->
  <xacro:macro name="realsense_camera" params="name parent *origin">
    <xacro:realsense_camera_model name="${name}" parent="${parent}">
      <xacro:insert_block name="origin" />
    </xacro:realsense_camera_model>

    <gazebo reference="${name}_link">
      <sensor type="camera" name="${name}_rgb">
        <camera>
          <horizontal_fov>${70.0 * pi/180.0}</horizontal_fov>
          <distortion>
            <center>0.5 0.5</center>
          </distortion>
        </camera>
        <plugin name="${name}_camera_rgb_controller" filename="">
          <robotNamespace>$(arg robot_name)/sensor/${name}/rgb</robotNamespace>
        </plugin>
      </sensor>

      <sensor type="depth" name="${name}_depth">
        <camera>
          <horizontal_fov>${59.0 * pi/180.0}</horizontal_fov>
        </camera>
        <plugin name="${name}_camera_depth_controller" filename="">
          <robotNamespace>$(arg robot_name)/sensor/${name}</robotNamespace>
        </plugin>
      </sensor>
    </gazebo>
  </xacro:macro>

</robot>

The depth camera has three topics.


But the generated pointcloud is slightly to the left of the real door. Why is there a difference in location, and how can I eliminate it?

Thanks a lot



"But the generated pointcloud is a little bit left to the real door" Could you clarify that a bit? How did you measure this offset? Is the camera mounted on a robot?

NEngelhard  (2018-10-25 05:50:42 -0500)

My first guess would be that the transform that defines the location of the RGB-D sensor on the robot is in slightly the wrong place or has the wrong angle; this will be very sensitive to angular errors. As @NEngelhard asked, the point cloud is in the wrong place relative to what? The image?
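To see how sensitive the result is to angular errors: a point at range r shifts sideways by roughly r·tan(θ) for a yaw (pan) mounting error of θ. A quick sketch in plain Python (the numbers are only illustrative, not taken from the question):

```python
import math

def lateral_offset(range_m, yaw_error_deg):
    """Approximate sideways shift of a point at the given range
    caused by a yaw (pan) error in the sensor mounting."""
    return range_m * math.tan(math.radians(yaw_error_deg))

# A door ~2 m away with only a 1 degree mounting error:
print(f"{lateral_offset(2.0, 1.0) * 100:.1f} cm")  # → 3.5 cm
```

So even a one-degree error in the sensor's mounting angle is enough to produce a visible sideways offset at door distance.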

PeteBlackerThe3rd  (2018-10-25 06:51:05 -0500)

I also used a LIDAR to generate a pointcloud of the door, and I believe the LIDAR is accurate. The difference is between the two pointclouds from the two sensors. I cannot upload a picture due to limited points, but the link should be working.

percy_liu  (2018-10-25 20:45:52 -0500)

I wonder who downvotes such a question...

percy_liu, I see a comment about an "asus_camera" macro, but not the macro itself. Also, the link does not work for me.

Humpelstilzchen  (2018-10-26 01:06:50 -0500)

If you have different sensors but the data do not match, wouldn't that point to (at best) a need for extrinsic calibration of both sensors?

gvdhoorn  (2018-10-26 04:44:02 -0500)

1 Answer


answered 2018-10-26 09:48:56 -0500

As @gvdhoorn has said, this looks like the extrinsic calibration of your two sensors is incorrect; fixing this should solve your problem.

The extrinsic calibration defines the position and orientation of your sensors within your robot. If the relative pose defined in your robot description is not the same as on the real robot, you will see misalignment like yours. A good start would be to double-check your measurements and the optical centers of the sensors to make sure they're correct.
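If hand measurement is not precise enough, the residual transform between the two clouds can be estimated from a few corresponding points (e.g. the door corners seen by both the LIDAR and the RGB-D camera). A minimal NumPy sketch of the standard Kabsch/SVD method, not tied to any particular ROS API:

```python
import numpy as np

def kabsch(src, dst):
    """Rigid transform (R, t) minimizing ||R @ src + t - dst|| for
    3xN arrays of corresponding points (Kabsch / SVD method)."""
    src_mean = src.mean(axis=1, keepdims=True)
    dst_mean = dst.mean(axis=1, keepdims=True)
    # Cross-covariance of the centered point sets
    H = (src - src_mean) @ (dst - dst_mean).T
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```

Feeding it matched points from the RGB-D cloud (src) and the trusted LIDAR cloud (dst) gives the correction you would fold back into the camera joint's origin in the URDF.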

Hope this helps.


