3D map with RTAB, SICK, IMU and Husky A200

asked 2016-08-23 18:20:24 -0600

Marcus Barnet

updated 2016-08-23 19:04:36 -0600

I'm continuing my work on 3D map building, as described in my previous topic.

Finally, I mounted my sensors (IMU, SICK and RealSense R200) on a mobile robot (Husky A200) and I'm doing some acquisitions outdoors. In particular, the IMU is mounted at the center of the robot frame, the SICK laser is mounted 10 centimeters above the IMU, and the RealSense is mounted 40 centimeters above the laser.

Now I have the /odometry/filtered topic, which is a combination of wheel encoder and IMU data. I added the laser information in this way:

  <arg name="subscribe_scan"          default="true"/>
  <arg name="scan_topic"              default="/scan"/>

inside the rgbd_mapping.launch file and I run the RTAB algorithm with these commands:

$ roscore
$ rosparam set use_sim_time true
$ roslaunch rtabmap_ros rgbd_mapping.launch rgb_topic:=/camera/color/image_raw depth_registered_topic:=/camera/depth/image_raw camera_info_topic:=/camera/color/camera_info compressed:=true frame_id:=base_link rtabmap_args:="--delete_db_on_start" estimation:=1
$ rosbag play --clock test1.bag   # the bag contains both RealSense and laser data
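(Side note: since subscribe_scan and scan_topic are exposed as <arg> entries, I suppose I could also pass them on the command line instead of editing the launch file, e.g.:)

$ roslaunch rtabmap_ros rgbd_mapping.launch subscribe_scan:=true scan_topic:=/scan rgb_topic:=/camera/color/image_raw depth_registered_topic:=/camera/depth/image_raw camera_info_topic:=/camera/color/camera_info compressed:=true frame_id:=base_link rtabmap_args:="--delete_db_on_start" estimation:=1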

How can I add the filtered odometry, which also contains the IMU data? Is there anything I have to change or add in the rgbd_mapping.launch file?

Moreover, I tried to check the tf frames for all sensors and I cannot see the laser frame; also, the RealSense frame is not a child of base_link.

How can I fix this frame problem? This is the frames.pdf file. Is the frame tree suitable for the RTAB algorithm?
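(For completeness, this is roughly how I check the tree; the frame names below are just what I expect from my setup, so they are assumptions:)

$ rosrun tf view_frames                     # generates frames.pdf from the live /tf tree
$ rosrun tf tf_echo base_link laser         # prints the base_link -> laser transform, if it exists
$ rosrun tf tf_echo base_link camera_link   # same check for the RealSense frame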

What happens if the robot has to face terrain disparity? Is the IMU-based /odometry/filtered able to compensate for the disparity by adjusting the laser and RealSense data according to the tilt angle?

This is my rgbd_mapping.launch file:

<launch>
  <!-- Your RGB-D sensor should be already started with "depth_registration:=true".
        Examples:
           $ roslaunch freenect_launch freenect.launch depth_registration:=true 
           $ roslaunch openni2_launch openni2.launch depth_registration:=true -->

  <!-- Choose visualization -->
  <arg name="rviz"                    default="false" />
  <arg name="rtabmapviz"              default="true" /> 

  <!-- Corresponding config files -->
  <arg name="rtabmapviz_cfg"          default="-d $(find rtabmap_ros)/launch/config/rgbd_gui.ini" />
  <arg name="rviz_cfg"                default="-d $(find rtabmap_ros)/launch/config/rgbd.rviz" />

  <arg name="frame_id"                default="camera_link"/>   <!-- Fixed frame id, you may set "base_link" or "base_footprint" if they are published -->
  <arg name="time_threshold"          default="0"/>             <!-- (ms) If not 0 ms, memory management is used to keep processing time on this fixed limit. -->
  <arg name="optimize_from_last_node" default="false"/>         <!-- Optimize the map from the last node. Should be true on multi-session mapping and when time threshold is set -->
  <arg name="database_path"           default="~/.ros/rtabmap.db"/>
  <arg name="rtabmap_args"            default=""/>              <!-- delete_db_on_start, udebug -->
  <arg name="launch_prefix"           default=""/>              <!-- for debugging purpose, it fills launch-prefix tag of the nodes -->

  <arg name="rgb_topic"               default="/camera/rgb/image_rect_color" />
  <arg name="depth_registered_topic"  default="/camera/depth_registered/image_raw" />
  <arg name="camera_info_topic"       default="/camera/rgb/camera_info" />
  <arg name="compressed"              default="false"/>
  <arg name="convert_depth_to_mm"     default="true"/>

  <arg name="subscribe_scan"          default="true"/>         <!-- Assuming 2D scan if set, rtabmap will do 3DoF mapping instead of 6DoF -->
  <arg name="scan_topic"              default="/scan"/>

  <arg name ...
  (remainder of the launch file truncated)

1 Answer


answered 2016-08-23 19:39:53 -0600

mjedmonds

updated 2016-08-23 19:40:18 -0600

First, you need to add a calibrated tf between your base_link and your realsense_frame. You can manually measure this and verify it. The robot needs to know where the camera is relative to its other parts.

You may also want to look at the robot_localization package to fuse your rtabmap odometry with the IMU and wheel odometry.
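A rough sketch of such a robot_localization setup for fusing the wheel odometry with the IMU is shown below; the input topic names are assumptions you would replace with your own, and the fused output is published on /odometry/filtered by default:

<launch>
  <!-- Sketch only: an EKF fusing wheel odometry and IMU.
       The input topic names below are assumptions; replace them with yours. -->
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization">
    <param name="frequency"       value="30"/>
    <param name="two_d_mode"      value="false"/>   <!-- keep 6DoF for uneven terrain -->
    <param name="odom_frame"      value="odom"/>
    <param name="base_link_frame" value="base_link"/>
    <param name="world_frame"     value="odom"/>

    <!-- wheel odometry: fuse the linear and yaw velocities -->
    <param name="odom0" value="/husky_velocity_controller/odom"/>
    <rosparam param="odom0_config">[false, false, false,
                                    false, false, false,
                                    true,  true,  false,
                                    false, false, true,
                                    false, false, false]</rosparam>

    <!-- IMU: fuse orientation and angular velocity -->
    <param name="imu0" value="/imu/data"/>
    <rosparam param="imu0_config">[false, false, false,
                                   true,  true,  true,
                                   false, false, false,
                                   true,  true,  true,
                                   false, false, false]</rosparam>
  </node>
</launch>

Each *_config array selects which state variables (x, y, z, roll, pitch, yaw, their velocities, and the linear accelerations) are fused from that sensor.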


Comments

rosrun tf static_transform_publisher 0 0 0.50 0 0 0 base_link realsense_frame 100

rosrun tf static_transform_publisher 0 0 0.1 0 0 0 base_link laser 100

Do you think these are correct?

/odometry/filtered is the output of robot_localization; how can I use it as input for RTAB?

Marcus Barnet (2016-08-24 02:35:59 -0600)

/odometry/filtered already contains the wheel odometry fused with the IMU data. I need to use this as input for RTAB, but I don't know how to set this inside the rgbd_mapping.launch file.

Marcus Barnet (2016-08-24 02:38:52 -0600)

Using /odometry/filtered as odometry:

<arg name="visual_odometry" default="false"/> <!-- Generate visual odometry -->
<arg name="odom_topic" default="/odometry/filtered"/> <!-- Odometry topic used if visual_odometry is false -->
matlabbe (2016-08-24 07:44:06 -0600)

Do you think the /odometry/filtered with IMU will improve the RTAB algorithm by compensating for terrain disparity? I mean, for example, if the robot climbs or tilts on agricultural terrain.

Marcus Barnet (2016-08-24 08:16:13 -0600)

Not sure what you mean by terrain disparity, but with the IMU you would have 6DoF odometry, instead of 3DoF when using only wheel encoders.

Note also that if you are on 3D terrain, don't set subscribe_scan to true, as rtabmap assumes you are on 2D terrain (indoor) when it is true.
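For example (keeping your other arguments, untested):

$ roslaunch rtabmap_ros rgbd_mapping.launch subscribe_scan:=false visual_odometry:=false odom_topic:=/odometry/filtered   # plus your camera remappings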

matlabbe (2016-08-24 08:36:49 -0600)

On agricultural terrain, there can be ditches that will cause the robot and the camera to bottom out, and slopes that will tip the robot and the camera over. With 6DoF, can RTAB compensate for these problems by building a more accurate 3D map?

Marcus Barnet (2016-08-24 09:19:15 -0600)

Moreover, if I'm on 3D terrain, can't I use the /scan topic at all, then? I noticed that using the laser, the map is improved with more features. Is there a way to add laser information on 3D terrain? Can't I compensate the scan data with the IMU data to achieve a 3D laser visualization?

Marcus Barnet (2016-08-24 09:20:53 -0600)

It seems more appropriate to have bottom-out and tipping avoidance be part of your navigation planner, but I'm not an expert on the topic.

However, the way I use robot_localization, I fuse the IMU with the wheel odometry and the visual odometry from rtabmap. See the robot_localization package.

mjedmonds (2016-08-24 11:25:34 -0600)
