
Running Turtlebot with viso2_ros (Linear odometry now working)

asked 2012-07-17 11:14:14 -0500 by YumiLench

updated 2012-07-17 14:47:43 -0500

We are running stereo_odometer in viso2_ros on the turtlebot. We are running ROS Electric on Ubuntu 11.10.

We disabled the /odom publishing in turtlebot_node.py and let stereo_odometer publish to /odom instead, so robot_pose_ekf now subscribes to the odometry message coming from viso2_ros.
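A minimal launch sketch of this wiring, assuming a /stereo camera namespace and the documented viso2_ros and robot_pose_ekf parameter names (the exact remapped topic names are assumptions for illustration, not our actual launch file):

```xml
<launch>
  <!-- Visual odometry from viso2_ros; "stereo" and "image" remaps follow the
       documented usage. The node's odometry output is remapped onto /odom
       (topic name assumed; verify with `rostopic list`). -->
  <node pkg="viso2_ros" type="stereo_odometer" name="stereo_odometer">
    <remap from="stereo" to="/stereo"/>
    <remap from="image" to="image_rect"/>
    <remap from="stereo_odometer/odometry" to="/odom"/>
    <!-- Let robot_pose_ekf, not stereo_odometer, broadcast the tf -->
    <param name="publish_tf" value="false"/>
  </node>

  <!-- robot_pose_ekf fuses /odom (now fed by viso2_ros) with the IMU;
       flag values mirror the setup described above. -->
  <node pkg="robot_pose_ekf" type="robot_pose_ekf" name="robot_pose_ekf">
    <param name="odom_used" value="true"/>
    <param name="imu_used" value="true"/>
    <param name="vo_used" value="false"/>
  </node>
</launch>
```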

However, when we visualize it in rviz, the base_link frame only rotates; it does not register any linear movement.

We set "publish_tf" to "false" because robot_pose_ekf (not stereo_odometer) is supposed to compute the tf from the odometry information published by viso2_ros.

Does anyone have any suggestions on how to debug this?

Since the angular odometry is performing fine, we think the visual odometry is up and running, but we're not sure how to debug a potential issue with the transforms (tf) or with the publishing/subscribing.

(We did write a custom tf between base_link and our stereo camera pair; it appears correct when inspected in rviz.)
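One common way to publish such a fixed transform is tf's static_transform_publisher; the frame names and offsets below are placeholders, not our actual values:

```shell
# Publish a static transform base_link -> stereo_camera every 100 ms.
# Arguments: x y z yaw pitch roll parent_frame child_frame period_ms
# (the 0.1 m forward / 0.2 m up offset is purely illustrative)
rosrun tf static_transform_publisher 0.1 0 0.2 0 0 0 base_link stereo_camera 100
```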

New info: we have traced the problem to the libviso2 library. When the robot is stationary, the process method of VisualOdometryStereo returns true, but as soon as the robot starts to move, it returns false. Could bad camera calibration or bad camera placement (we used two identical USB cameras mounted side by side) cause this?


3 Answers


answered 2012-07-17 20:13:05 -0500

Looking at the code, it appears that false is returned when the number of valid feature matches between the left and right stereo frames is lower than 6 (see here). Have you calibrated your cameras properly, and do you use rectified images as input for viso2? Also, are the camera images synchronized and not too blurry? I imagine the combination of bad synchronization and a rolling shutter could be problematic.

Testing viso2 with a custom stereo system (with synced frames and calibrated using StereoCalibration) showed promising results for us (including linear/translational motion).


answered 2012-07-17 22:49:38 -0500 by Stephan

First make sure your stereo camera system is working correctly. When you are able to generate good-looking disparity maps and/or point clouds (e.g. by following the tutorial on choosing good stereo parameters), the stereo_odometer of viso2_ros should work fine. viso2_ros does not use the disparity map or the point cloud produced by stereo_image_proc, but if they look good your stereo system is working.
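To eyeball the disparity map, the stock stereo_image_proc and image_view tools can be used (the /stereo namespace is an assumption; substitute your own camera namespace):

```shell
# Run rectification + disparity computation in the camera's namespace
ROS_NAMESPACE=/stereo rosrun stereo_image_proc stereo_image_proc

# Visualize the resulting disparity image; it should look smooth and dense
rosrun image_view disparity_view image:=/stereo/disparity
```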

To debug your tf tree, try the following: do not publish any tf and run just the stereo_odometer with the parameter base_link_frame_id set to the frame_id of your stereo camera. Visualize the transform /odom -> <frame_id of your camera> in rviz by setting the fixed frame to /odom. Does the frame move when the camera moves? If not, stereo_odometer is not working for some reason (too few features, inconsistent matches); you might try adjusting the parameter match_disp_tolerance.

If the tf /odom -> <frame_id of your camera> looks good there must be some other problem in your tf tree.
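The transform can also be checked on the command line with the standard tf tools (replace stereo_camera with your camera's actual frame_id):

```shell
# Print the /odom -> camera transform as it updates; move the robot and
# watch whether the translation component changes
rosrun tf tf_echo /odom /stereo_camera

# Dump the whole tf tree to frames.pdf to spot disconnected branches
rosrun tf view_frames
```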

A side note: I just moved the former srv_vision stack that contained viso2_ros to a new stack called viso2; please update your repository.


answered 2012-07-18 09:19:22 -0500 by YumiLench

Thank you guys for helping out!

We rectified the images using stereo_image_proc, then recalibrated the cameras after securing them more firmly in a fixed position, aiming for a lower epipolar error. It works well now!
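For reference, stereo recalibration can be done with the standard camera_calibration tool, which reports the epipolar ("epi.") error live while you move the checkerboard; the board size, square size, and topic names below are assumptions to adapt to your setup:

```shell
# Interactive stereo calibration: watch the reported epipolar error and
# recalibrate until it stays low across the image
rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.024 \
  left:=/stereo/left/image_raw right:=/stereo/right/image_raw \
  left_camera:=/stereo/left right_camera:=/stereo/right
```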



Stats

Seen: 704 times

Last updated: Jul 18 '12