
How to use good parameters in rtabmap_ros (stereo_mapping.launch)

asked 2016-02-27 17:50:55 -0600

Greenout

I found good parameters with rqt_reconfigure: the resulting point cloud has no speckle. But when I run roslaunch rtabmap_ros stereo_mapping.launch stereo_namespace:="/stereo_camera" rtabmap_args:="--delete_db_on_start" approximate_sync:=true, my map has speckle. How can I use those parameters for mapping? [screenshot]


Comments

What do you mean when you say "I get good parameter on rqt_reconfig"? Are you using rqt_reconfigure to modify the default parameters?

ahendrix (2016-02-27 20:07:55 -0600)

Yes, the result is good (/stereo_camera/points2), but when I launch rtabmap_ros stereo_mapping.launch I get voxel_cloud, and it still has speckle. Why? How can I make it use /stereo_camera/points2?

Greenout (2016-02-27 20:28:25 -0600)

I still don't understand. Maybe a few pictures or screenshots would help.

ahendrix (2016-02-27 23:38:37 -0600)

Can I use rtabmap's stereo hand-held mapping with the PointCloud2 (/stereo_camera/points2) from stereo_image_proc to build the map? If I can't use it, how should I calibrate the disparity map (voxel_cloud)?

Greenout (2016-02-28 08:00:52 -0600)

2 Answers


answered 2016-02-29 15:51:50 -0600

matlabbe

updated 2016-03-13 18:12:16 -0600

Hi,

rtabmap regenerates the disparity image for visualization, so it ignores the parameters you set for stereo_image_proc. If you want to use the disparity image generated by stereo_image_proc, I recommend doing something similar to the Stereo A setup: generate a depth image from the disparity image (with the rtabmap_ros/disparity_to_depth nodelet), then use rgbd_mapping.launch instead of stereo_mapping.launch.

<group ns="/stereo_camera" >
    <node pkg="stereo_image_proc" type="stereo_image_proc" name="stereo_image_proc"/>
    <node pkg="nodelet" type="nodelet" name="disparity2depth" args="standalone rtabmap_ros/disparity_to_depth"/>
</group>

Launch:

$ roslaunch rtabmap_ros rgbd_mapping.launch rgb_topic:=/stereo_camera/left/image_rect_color depth_registered_topic:=/stereo_camera/depth camera_info_topic:=/stereo_camera/left/camera_info rtabmap_args:="--delete_db_on_start"

Note that the next rtabmap version will have mostly the same parameters as stereo_image_proc to tune the regenerated disparity image.


Comments

Can I use another odometry node in the Stereo A setup, besides viso2_ros?

Greenout (2016-02-29 23:47:53 -0600)

Yes, you can. You will then have something similar to the Kinect+Odometry setup.

matlabbe (2016-03-01 06:56:42 -0600)
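For reference, a minimal sketch of such a setup, assuming an external odometry node already publishes on /odom (the topic name is a placeholder; visual_odometry and odom_topic are arguments of rgbd_mapping.launch, as shown in the launch file later in this thread):

$ roslaunch rtabmap_ros rgbd_mapping.launch \
    visual_odometry:=false \
    odom_topic:=/odom \
    rgb_topic:=/stereo_camera/left/image_rect_color \
    depth_registered_topic:=/stereo_camera/depth \
    camera_info_topic:=/stereo_camera/left/camera_info \
    rtabmap_args:="--delete_db_on_start"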

Could you check my code, please? It's in my answer below.

Greenout (2016-03-13 16:09:24 -0600)

Can I use input from a video file for mapping? My computer's specs are low, so it runs slower than real time.

Greenout (2016-03-27 03:44:41 -0600)

Well, the best way to do that is to record a rosbag and replay it at a lower rate.

matlabbe (2016-03-29 09:32:45 -0600)
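For example, a sketch only (the topic names assume the usb_cam nodes from the launch file below, and --rate 0.5 is just an illustrative value):

# record the raw stereo topics while moving the camera
$ rosbag record /stereo_camera/left/image_raw /stereo_camera/left/camera_info \
                /stereo_camera/right/image_raw /stereo_camera/right/camera_info /tf

# replay at half speed with simulated time, then start stereo_image_proc and rtabmap
$ rosparam set use_sim_time true
$ rosbag play --clock --rate 0.5 mybag.bag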

Can I use rendered images as input?

Greenout (2016-03-29 14:38:33 -0600)

answered 2016-03-13 15:56:47 -0600

Greenout

updated 2016-03-16 13:25:31 -0600

Could you check my result, please?

[screenshot of the resulting map]

roslaunch rtabmap_ros rgbd_mapping.launch rgb_topic:=/stereo_camera/left/image_rect_color depth_registered_topic:=/stereo_camera/depth camera_info_topic:=/stereo_camera/left/camera_info frame_id:=base_link rtabmap_args:="--delete_db_on_start" estimation:=1 compressed:=true

I use two digital cameras as a stereo camera.

My_stereo_camera.launch

<launch>
  <group ns="/stereo_camera" >
    <node name="left" pkg="usb_cam" type="usb_cam_node">
      <param name="video_device" value="/dev/video1" />
      <param name="image_width" value="640" />
      <param name="image_height" value="480" />
      <param name="camera_info_url" type="string" value="file:///home/greenout/catkin_ws/left.yaml" />
      <param name="framerate" value="60" />
      <param name="camera_frame_id" value="left_camera" />
    </node>
    <node name="right" pkg="usb_cam" type="usb_cam_node">
      <param name="video_device" value="/dev/video0" />
      <param name="image_width" value="640" />
      <param name="image_height" value="480" />
      <param name="camera_info_url" type="string" value="file:///home/greenout/catkin_ws/right.yaml" />
      <param name="framerate" value="60" />
      <param name="camera_frame_id" value="right_camera" />
    </node>
  </group>

  <!-- Rotate the camera frame. -->
  <arg name="pi/2" value="1.5707963267948966" />
  <arg name="optical_rotate" value="0 0 0 -$(arg pi/2) 0 -$(arg pi/2)" />

  <node pkg="tf" type="static_transform_publisher" name="camera_base_link"
        args="0 0 0 0 0 0 left_camera right_camera 100" />
  <node pkg="tf" type="static_transform_publisher" name="camera_base_link2"
        args="0 0 0 0 0 0 stereo_camera left_camera 100" />
  <node pkg="tf" type="static_transform_publisher" name="camera_base_link3"
        args="$(arg optical_rotate) base_link stereo_camera 100" />
</launch>

stereo_image_proc for image rectification and disparity computation

<launch>
  <!-- Run the ROS package stereo_image_proc for image rectification and disparity computation -->
  <group ns="/stereo_camera" >
    <node pkg="stereo_image_proc" type="stereo_image_proc" name="stereo_image_proc" required="true" output="screen">
      <param name="approximate_sync" value="true"/>
    </node>
    <node pkg="nodelet" type="nodelet" name="disparity2depth" args="standalone rtabmap_ros/disparity_to_depth"/>
  </group>
</launch>

rgbd_mapping.launch

<launch>
  <!-- Your RGB-D sensor should be already started with "depth_registration:=true".
        Examples:
           $ roslaunch freenect_launch freenect.launch depth_registration:=true 
           $ roslaunch openni2_launch openni2.launch depth_registration:=true -->

  <!-- Choose visualization -->
  <arg name="rviz"                    default="true" />
  <arg name="rtabmapviz"              default="false" /> 

  <!-- Corresponding config files -->
  <arg name="rtabmapviz_cfg"          default="-d $(find rtabmap_ros)/launch/config/rgbd_gui.ini" />
  <arg name="rviz_cfg"                default="-d $(find rtabmap_ros)/launch/config/rgbd.rviz" />

  <arg name="frame_id"                default="camera_link"/>   <!-- Fixed frame id, you may set "base_link" or "base_footprint" if they are published -->
  <arg name="time_threshold"          default="0"/>             <!-- (ms) If not 0 ms, memory management is used to keep processing time on this fixed limit. -->
  <arg name="optimize_from_last_node" default="false"/>         <!-- Optimize the map from the last node. Should be true on multi-session mapping and when time threshold is set -->
  <arg name="database_path"           default="~/.ros/rtabmap.db"/>
  <arg name="rtabmap_args"            default=""/>              <!-- delete_db_on_start, udebug -->
  <arg name="launch_prefix"           default=""/>              <!-- for debugging purpose, it fills launch-prefix tag of the nodes -->

  <arg name="rgb_topic"               default="/camera/rgb/image_rect_color" />
  <arg name="depth_registered_topic"  default="/camera/depth_registered/image_raw" />
  <arg name="camera_info_topic"       default="/camera/rgb/camera_info" />
  <arg name="compressed"              default="false"/>
  <arg name="convert_depth_to_mm"     default="true"/>

  <arg name="subscribe_scan"          default="false"/>         <!-- Assuming 2D scan if set, rtabmap will do 3DoF mapping instead of 6DoF -->
  <arg name="scan_topic"              default="/scan"/>

  <arg name="visual_odometry"         default="true"/>          <!-- Generate visual odometry -->
  <arg name="odom_topic"              default="/odom"/>         <!-- Odometry topic used if visual_odometry is false -->

  <arg name="namespace"               default="rtabmap"/>
  <arg name="wait_for_transform"      default="0.1"/>

  <!-- Odometry parameters: -->
  <arg ...
  (remainder of rgbd_mapping.launch truncated)

Comments

The problem is that the frame_id set for rtabmap is camera_link, but that TF frame is not published (base_link is). Launch rgbd_mapping.launch with frame_id:=base_link to remap the frame_id. Tip: you can debug TF frames with $ rosrun tf view_frames too.

matlabbe (2016-03-13 18:19:26 -0600)

I use roslaunch rtabmap_ros rgbd_mapping.launch rgb_topic:=/stereo_camera/left/image_rect_color depth_registered_topic:=/stereo_camera/depth camera_info_topic:=/stereo_camera/left/camera_info frame_id:=base_link rtabmap_args:="--delete_db_on_start" estimation:=1 compressed:=true

Greenout (2016-03-16 13:19:37 -0600)

Could you check my result, please? See my updated answer above.

Greenout (2016-03-16 13:21:39 -0600)

The screenshot looks OK. However, you don't need to use "compressed" if the camera is plugged into the same computer as rtabmap.

matlabbe (2016-03-16 15:00:11 -0600)
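In other words, the same command as above, just without compressed:=true (everything else unchanged):

$ roslaunch rtabmap_ros rgbd_mapping.launch rgb_topic:=/stereo_camera/left/image_rect_color depth_registered_topic:=/stereo_camera/depth camera_info_topic:=/stereo_camera/left/camera_info frame_id:=base_link rtabmap_args:="--delete_db_on_start" estimation:=1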

OK, thank you.

Could you tell me about

    <!-- Rotate the camera frame. -->
    args="0 0 0 0 0 0 left_camera right_camera 100" />

Will it affect the mapping process? Should I use the real transform values?

Greenout (2016-03-17 12:17:32 -0600)

rtabmap doesn't use the TF between the left and right cameras, so those values would not affect the mapping.

matlabbe (2016-03-17 13:02:45 -0600)

Could you recommend settings for the odometry parameters

    <!-- Odometry parameters: -->
    strategy, feature, estimation, nn, max_depth, min_inliers, inlier_distance, local_map, variance_inliers

for indoor and outdoor use?

Greenout (2016-03-18 08:40:18 -0600)

Default parameters should be ok. estimation (0 or 1) and max_depth can be tuned depending on the sensor used.

matlabbe (2016-03-21 14:58:31 -0600)
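As an illustration only: estimation:=1 is the value already used in the command above, and max_depth:=4.0 is a guessed value that should be adapted to your stereo baseline and environment.

$ roslaunch rtabmap_ros rgbd_mapping.launch estimation:=1 max_depth:=4.0 \
    rgb_topic:=/stereo_camera/left/image_rect_color depth_registered_topic:=/stereo_camera/depth \
    camera_info_topic:=/stereo_camera/left/camera_info frame_id:=base_link \
    rtabmap_args:="--delete_db_on_start"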
