# Realsense R200 with rtabmap rgbd odometry

Hello, I'm working with the Intel RealSense R200 camera with the objective of getting visual odometry. For this, I'm using Intel's RealSense ROS SDK along with the rtabmap_ros rgbd_odometry node. I've verified through rviz that the color and depth camera data are coming out correctly, but most of the time, when I start rgbd_odometry, I get one of these errors and do not get an odometry stream:

```
[ WARN] (2015-10-10 14:15:11.873) OdometryBOW.cpp:317::computeTransform() Local map too small!? (15 < 20)
[ WARN] (2015-10-11 00:05:41.095) OdometryBOW.cpp:304::computeTransform() Not enough correspondences (0 < 20)
[ WARN] (2015-10-11 00:05:41.233) OdometryBOW.cpp:308::computeTransform() Not enough inliers (9 < 20)
```


Sometimes, odometry works fine when I start the rgbd_odometry node, but as I move the camera, the warnings eventually show up. At first I thought this was specific to when the camera is facing far-off targets (>2 m), but it seems to happen even when the camera is facing something close by (<1 m). My guess is that it is not able to identify enough features in the frame, but I'm wondering if it could be something else, since I think there are a lot of features in my test area (it's not yet on a mobile robot). Can anyone shed light on what these warnings mean? I'm new to ROS, so I could also be missing something.

Image size is 320x240 at 30 frames per second. Setup is Ubuntu 14.04 with ROS Indigo, running on a 4th-gen i5 with no external GPU.

Sample bag file



Hi,

You can read the Odometry Lost section of this tutorial: https://github.com/introlab/rtabmap/w... .

There seems to be a lack of features. Some parameters that you can tune:

```
<param name="Odom/MaxDepth"            type="string" value="0"/>
<param name="Odom/InlierDistance"      type="string" value="0.1"/> <!-- when EstimationType=0 -->
<param name="Odom/EstimationType"      type="string" value="0"/>   <!-- 0=3D->3D, 1=3D->2D (PnP) -->
```
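For context, here is a minimal sketch of where such parameters would go in a launch file. The node and remap names below are assumptions based on the standard rtabmap_ros rgbd_odometry interface and the realsense driver's topic names, and should be adapted to your setup:

```xml
<launch>
  <node pkg="rtabmap_ros" type="rgbd_odometry" name="rgbd_odometry">
    <!-- Remap to the R200 topics (names assumed from the realsense driver) -->
    <remap from="rgb/image"       to="/camera/color/image_raw"/>
    <remap from="depth/image"     to="/camera/depth/image_raw"/>
    <remap from="rgb/camera_info" to="/camera/color/camera_info"/>

    <param name="Odom/MaxDepth"       type="string" value="0"/>   <!-- 0 = no depth limit -->
    <param name="Odom/InlierDistance" type="string" value="0.1"/> <!-- when EstimationType=0 -->
    <param name="Odom/EstimationType" type="string" value="0"/>   <!-- 0=3D->3D, 1=3D->2D (PnP) -->
  </node>
</launch>
```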


Odom/EstimationType=1 generates more inliers, though not necessarily more precise odometry.
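To make the two inlier criteria concrete, here is a small standalone sketch (not rtabmap code; the intrinsics and example points are made up). The 3D->3D mode accepts a correspondence when the Euclidean distance between matched 3D points is under Odom/InlierDistance (meters), while the 3D->2D (PnP) mode accepts it when the reprojection error is under Odom/PnPReprojError (pixels):

```python
import math

# Hypothetical pinhole intrinsics for a 320x240 image (assumed values).
FX, FY, CX, CY = 300.0, 300.0, 160.0, 120.0

def euclidean_inlier(p_map, p_frame, inlier_distance=0.1):
    """3D->3D check (Odom/EstimationType=0): distance between
    corresponding 3D points must be under Odom/InlierDistance (meters)."""
    return math.dist(p_map, p_frame) <= inlier_distance

def reprojection_inlier(p3d, uv_observed, reproj_error=5.0):
    """3D->2D check (Odom/EstimationType=1, PnP): project the 3D point
    into the next image and compare with the observed pixel; the error
    must be under Odom/PnPReprojError (pixels). No depth is needed for
    the observed feature."""
    x, y, z = p3d
    u = FX * x / z + CX
    v = FY * y / z + CY
    return math.dist((u, v), uv_observed) <= reproj_error

# A 15 cm gap between matched 3D points fails the 0.1 m threshold:
print(euclidean_inlier((0.5, 0.0, 1.0), (0.5, 0.0, 1.15)))   # False
# A ~2.2 px reprojection error passes the 5 px threshold:
print(reprojection_inlier((0.1, -0.05, 1.0), (192.0, 104.0)))  # True
```

This also illustrates why PnP tends to keep more inliers: features without valid depth in the new frame can still contribute to the 2D check.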

If you can record a small rosbag (like 10 sec), I could check it more deeply:

```
$ roslaunch realsense realsense_r200_launch.launch
$ rosbag record camera/depth/image_raw/compressedDepth camera/color/image_raw/compressed camera/color/camera_info
```


UPDATE: I tried your rosbag like this:

```
$ roscore
$ rosparam set use_sim_time true
$ rosrun tf static_transform_publisher 0 0 0 -1.5707963 0 -1.5707963 base_link camera_color_optical_frame 100
$ rosrun tf static_transform_publisher 0 0 0 0 0 0 camera_color_optical_frame camera_depth_optical_frame 100
$ roslaunch rtabmap_ros rgbd_mapping.launch rgb_topic:=/camera/color/image_raw depth_registered_topic:=/camera/depth/image_raw camera_info_topic:=/camera/color/camera_info compressed:=true frame_id:=base_link rtabmap_args:="--delete_db_on_start" estimation:=1
$ rosbag play --clock 2015-10-20-22-55-08.bag
```


Notes:

• Since the depth is registered to the color image, their frame_id could be the same (here I put a null transform between the color and depth frames).
• On live experiments with the sensor, you can remove compressed:=true.
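As a sanity check on those static_transform_publisher arguments (their order is x y z yaw pitch roll), the rotation yaw=-pi/2, pitch=0, roll=-pi/2 is exactly the one that maps the optical-frame convention (z forward, x right, y down) into the ROS body-frame convention (x forward, y left, z up). A small standalone sketch, no ROS required:

```python
import math

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(R, v):
    # Rotate vector v and round away floating-point noise.
    return tuple(round(sum(R[i][k] * v[k] for k in range(3)), 6) for i in range(3))

# static_transform_publisher takes yaw pitch roll (fixed-axis ZYX),
# so the rotation is R = Rz(yaw) * Ry(pitch) * Rx(roll).
yaw, pitch, roll = -math.pi / 2, 0.0, -math.pi / 2
R = matmul(rot_z(yaw), matmul(rot_y(pitch), rot_x(roll)))

# Express each optical axis in base_link coordinates:
print(apply(R, (0.0, 0.0, 1.0)))  # optical z (forward) maps to base x
print(apply(R, (1.0, 0.0, 0.0)))  # optical x (right) maps to -base y
print(apply(R, (0.0, 1.0, 0.0)))  # optical y (down) maps to -base z
```

If the camera frame in your bag does not follow the optical convention, this transform would be wrong, and odometry output would appear rotated in rviz.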

Result: (screenshot)

I have similar results with estimation:=0 too. The estimation argument of rgbd_mapping.launch is in fact the Odom/EstimationType parameter of the odometry node:

```
$ rosrun rtabmap_ros rgbd_odometry --params | grep Odom/EstimationType
Param: Odom/EstimationType = "0" [Motion estimation approach: 0:3D->3D, 1:3D->2D (PnP)]
```

• Odom/EstimationType=0: Transformation estimation is done with 3D->3D feature correspondences, which minimizes the Euclidean distance between the inliers. The parameter Odom/InlierDistance=0.1 can be tuned to include more inliers or to be more precise.

• Odom/EstimationType=1: Transformation estimation is done with 3D->2D feature correspondences, which minimizes the reprojection error on the next frame. The parameter Odom/PnPReprojError=5.0 can be tuned to include more inliers or to be more precise. The advantage of PnP is that depth values are not required for features in the next image, so features without depth can still be used. In the 3D->3D approach they cannot be used, since depth is required.

cheers

## Comments

Thanks for your response! I was finally able to try adjusting the EstimationType and see that odometry gets lost less often than before. How does this parameter change the odometry algorithm? I've added a bag file to my question. (2015-10-21 00:10:55 -0500)

Updated the answer! (2015-10-21 10:26:23 -0500)

Hi, is there anything else you did with the RealSense camera to get this working? I can play the bag file correctly, but when I try to use my camera (both live and with a bag), nothing shows up in rtabmap. The camera input/driver is working; rviz works correctly. Thanks! (2015-12-23 09:07:44 -0500)

Here's the command I use to start realsense:

```
roslaunch realsense realsense_r200_launch.launch dHeight:=240 dWidth:=320 dFPS:=30 cHeight:=240 cWidth:=320 cFPS:=30
```

Note that the width/height of the depth and rgb cameras are the same, since that's what rgbd_odometry expects. (2015-12-23 22:10:22 -0500)

Hi, I am following your tips and got realsense and rtabmap_ros running, but I got a warning. I published the tf exactly as:

```
rosrun tf static_transform_publisher 0 0 0 -1.5707963 0 -1.5707963 base_link camera_color_optical_frame 100
rosrun tf static_transform_publisher 0 0 0 0 0 0 camera_color_optical_frame camera_depth_optical_frame 100
```

and ran in live mode. I didn't get any result in rtabmapviz.
