
3D reconstruction with stereo cameras

asked 2017-09-30 11:38:16 -0500

MateToth

updated 2017-10-02 11:47:41 -0500

Dear ROS community!

I've started to build a stereo camera system to reconstruct its field of view. I use two Logitech C270 webcams on a base stand to capture the image streams. (Here you can see the prototype: )

For the project it's necessary to keep the camera optics as close together as possible, so I've mounted one camera vertically. I use the video_stream_opencv package to capture and rotate the images and to publish them to the other nodes.

Because of the later processing stages, and to save some hardware resources, I thought it was necessary to synchronize the timestamps of the images and camera infos before calibration, rectification, etc. So I created a synchronization node that approximately synchronizes the image frames with the camera info messages and republishes the data with identical timestamps. I assumed that after this synchronization approx_sync would no longer be needed, but I think I was wrong. To test the system I also started a static TF publisher.
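The matching policy such a synchronization node relies on can be sketched in plain Python. This is a hypothetical `approx_sync` helper written for illustration only, not the actual message_filters implementation; it greedily pairs each left-camera stamp with the nearest right-camera stamp within a tolerance ("slop"):

```python
def approx_sync(left_stamps, right_stamps, slop=0.02):
    """Pair left/right timestamps whose difference is below `slop` seconds,
    mimicking an ApproximateTime-style policy (greedy nearest-neighbour sketch).
    Stamps are assumed to be sorted, as they would be on a live topic."""
    pairs = []
    j = 0
    for t_left in left_stamps:
        # Advance the right-hand index while the next candidate is closer.
        while (j + 1 < len(right_stamps)
               and abs(right_stamps[j + 1] - t_left) < abs(right_stamps[j] - t_left)):
            j += 1
        # Accept the pair only if the stamps are close enough.
        if j < len(right_stamps) and abs(right_stamps[j] - t_left) <= slop:
            pairs.append((t_left, right_stamps[j]))
    return pairs

# Example: the right camera lags by ~10 ms; every frame still pairs up.
left = [0.00, 0.04, 0.08]
right = [0.01, 0.05, 0.09]
print(approx_sync(left, right))
# -> [(0.0, 0.01), (0.04, 0.05), (0.08, 0.09)]
```

Republishing each matched pair with a single shared timestamp is then what lets downstream nodes use exact-time subscription (approx_sync:=false).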

Anyway, I couldn't get any point cloud out of the system, and a warning message appears frequently in the terminal:

odometry: Could not get transform from base_link to left (stamp=1506788450.735916) after 0.200000 seconds ("wait_for_transform_duration"=0.200000)! Error=". canTransform returned after 0.202835 timeout was 0.2.

I suspect the static TF publisher, but I'm not sure.

Here is my launch file:

<launch>

<arg name="fps" default="25"/>

<!-- Synchronization -->
<arg name="syncronizer_namespace"       default="/syncronizer"/>
<arg name="left_camera_raw"             default="$(arg syncronizer_namespace)/left" />
<arg name="right_camera_raw"            default="$(arg syncronizer_namespace)/right" />
<arg name="left_camera_info_topic"      default="$(arg syncronizer_namespace)/left/camera_info" />
<arg name="right_camera_info_topic"     default="$(arg syncronizer_namespace)/right/camera_info" />

<!-- Stereo -->
<arg name="stereo_namespace"            default="/stereo_camera"/>
<arg name="left_image_topic"            default="$(arg stereo_namespace)/left/image_rect" />
<arg name="right_image_topic"           default="$(arg stereo_namespace)/right/image_rect" />

<arg name="approx_sync"                 default="true"/>
<arg name="queue_size"                  default="5"/>

<!-- Transform -->
<arg name="use_static_transform"        default="true"/>

<!-- 2D visualization -->
<arg name="visualize"                   default="false"/>
<arg name="show_raw"                    default="true"/>
<arg name="show_disparity"              default="true"/>

<!-- Visual SLAM -->
<arg name="frame_id"                    default="base_link"/>   <!-- Fixed frame id, set "base_link" or "base_footprint" if they are published -->
<arg name="rtabmap"                     default="true"/>
<arg name="odometry"                    default="true"/>

<!-- Odometry -->
<arg name="odom_frame_id"               default="odom"/>            <!-- If set, TF is used to get odometry instead of the topic -->
<arg name="ground_truth_frame_id"       default=""/>            <!-- e.g., "world" -->
<arg name="ground_truth_base_frame_id"  default=""/>            <!-- e.g., "tracker", a fake frame matching the frame "frame_id" (but on different TF tree) -->
<arg name="wait_for_transform"          default="true"/>
<arg name="wait_for_transform_duration" default="0.2"/>

<!-- 3D visualization -->
<arg name="rviz"                        default="false"/>
<arg name="rtabmapviz"                  default="true"/>

<arg name="localization"            default="false"/>
<arg name="time_threshold"          default="0"/>
<arg name="optimize_from_last_node" default="false"/>
<arg name="launch_prefix"           default=""/>
<arg name="convert_depth_to_mm"     default="true"/>
<arg name="subscribe_scan"          default="true"/>
<arg name="subscribe_scan_cloud"    default="false"/>
<arg name="scan_cloud_topic"        default="/scan_cloud"/>
<arg name="visual_odometry"         default="true"/> 

<arg name="camera_info" default="camera_info"/>

<!-- Core functionality ***********************************************************************-->

<!-- Camera -->
<group ns="/camera">
    <node pkg="nodelet" type="nodelet" name="stereo_camera_nodelet ...


<node .. type="static_transform_publisher" .. args=".. /base_link /camera_link 100" />

I don't know if it's the cause, but the last argument there is the period in milliseconds, not the rate. So static_transform_publisher (STP) is publishing at only 10 Hz here.

Try STP from tf2.
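A tf2 version of that line might look like the following. The frame names and node name are assumed from the snippet above; tf2_ros takes no period argument because static transforms are latched and published once:

```xml
<!-- tf2_ros static_transform_publisher: args are "x y z yaw pitch roll frame_id child_frame_id" -->
<node pkg="tf2_ros" type="static_transform_publisher" name="base_to_camera"
      args="0 0 0 0 0 0 base_link camera_link" />
```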

gvdhoorn  ( 2017-10-01 06:00:03 -0500 )

I tried to use tf2 but unfortunately it didn't help.

MateToth  ( 2017-10-02 12:25:20 -0500 )

1 Answer


answered 2017-10-02 12:41:13 -0500

matlabbe

updated 2017-10-02 12:43:28 -0500

What is your TF tree? ($ rosrun tf view_frames)

As the error says, odometry cannot transform frame left to frame base_link. You may want to add a static transform between camera_link and left like this:

<arg name="pi/2" value="1.5707963267948966" />
<node if="$(arg use_static_transform)" pkg="tf" type="static_transform_publisher" name="camera_optical" 
      args="0 0 0 -$(arg pi/2) 0 -$(arg pi/2) /camera_link /left 100" />

... so you have base_link -> camera_link -> left.
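Assuming the frame names above, the resulting chain can be checked with the standard tf command-line tools:

```
$ rosrun tf view_frames            # writes frames.pdf showing the whole TF tree
$ rosrun tf tf_echo base_link left # prints the resolved transform, if the chain is complete
```

If tf_echo reports the transform without errors, odometry should be able to resolve it as well.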

Note that you don't need to pre-synchronize, as the stereo_odometry and rtabmap nodes can do it directly with approx_sync:=true. But yes, if you pre-synchronize and set the same timestamp on all image/camera_info topics, you can use approx_sync:=false for odometry and rtabmap. If you still get poor results after that, it may be caused by bad stereo rectification and/or synchronization. You could add an example of the left/right images (as well as the disparity image generated by stereo_image_proc) to your question, to check that they at least look correctly rectified.

I strongly suggest getting a real stereo camera with hardware (not software!) stereo synchronization, to get good results while the robot is moving.





I thought the same thing. I copied your code into my launch file, and after the modification I didn't get that warning message again, and now I see the point cloud! Thank you!

I will use a real stereo camera, but now that prototype is also good for me. Thanks again for your help and suggestions!

MateToth  ( 2017-10-02 14:44:30 -0500 )


