2D SLAM with gmapping and openni_kinect

asked 2011-03-16 04:29:45 -0500

updated 2014-04-20 14:05:53 -0500

I know it's possible to do 2D SLAM with a Kinect using slam_gmapping, e.g. as shown in this post. Could anyone please tell me how exactly to do that?

I've already installed pointcloud_to_laserscan, slam_gmapping, turtlebot and turtlebot_apps. But after running roslaunch turtlebot_navigation gmapping_demo.launch, all I get is an info message saying "Still waiting on map". What should I execute, and in which order, to obtain a map like the one in turtlebot_navigation's tutorial?


OK, I think I partially got it. There were some errors in the default pointcloud_to_laserscan launch file. My working version is below.

<launch>
  <!-- kinect and frame ids -->
  <include file="$(find openni_camera)/launch/openni_node.launch"/>

  <!-- Kinect -->
  <node pkg="nodelet" type="nodelet" name="openni_manager" output="screen" respawn="true" args="manager"/>

  <!-- fake laser -->
  <node pkg="nodelet" type="nodelet" name="kinect_laser" args="load pointcloud_to_laserscan/CloudToScan openni_camera">
    <param name="output_frame_id" value="/openni_depth_frame"/>
    <remap from="cloud" to="cloud_throttled"/>
  </node>

  <!-- throttling -->
  <node pkg="nodelet" type="nodelet" name="pointcloud_throttle" args="load pointcloud_to_laserscan/CloudThrottle openni_camera">
    <param name="max_rate" value="2"/>
    <remap from="cloud_in" to="/camera/depth/points"/>
    <remap from="cloud_out" to="cloud_throttled"/>
  </node>
</launch>

When I now run rosrun gmapping slam_gmapping, I get a warning saying:

[ WARN] [1300374330.893690231]: MessageFilter [target=/odom ]: Dropped 100.00% of messages so far. Please turn the [ros.gmapping.message_notifier] rosconsole logger to DEBUG for more information.
[DEBUG] [1300374332.317853661]: MessageFilter [target=/odom ]: Removed oldest message because buffer is full, count now 5 (frame_id=/kinect_depth_frame, stamp=1300374331.992896)
[DEBUG] [1300374332.318014019]: MessageFilter [target=/odom ]: Added message in frame /kinect_depth_frame at time 1300374332.311, count now 5
[DEBUG] [1300374332.376768593]: MessageFilter [target=/odom ]: Removed oldest message because buffer is full, count now 5 (frame_id=/kinect_depth_frame, stamp=1300374332.060879)

I think the problem might be that no tf tree connecting /openni_camera and /map is defined - how do I achieve this?
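For what it's worth, a fixed transform between two frames can be published with tf's static_transform_publisher. The sketch below is hypothetical: the /base_link parent frame, the identity offset and the 100 ms period are assumptions, not taken from this setup.

```xml
<launch>
  <!-- Hypothetical sketch: broadcast an identity transform from an assumed
       /base_link frame to the Kinect depth frame every 100 ms -->
  <node pkg="tf" type="static_transform_publisher" name="base_to_kinect"
        args="0 0 0 0 0 0 /base_link /openni_depth_frame 100"/>
</launch>
```

Note that gmapping additionally expects an odom-to-base transform driven by real odometry, so static transforms alone will not satisfy it.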

Any help appreciated, Tom.


3 answers


answered 2011-03-17 12:33:23 -0500

Hi Tom,

This does not answer your gmapping question, but the launch file you posted above does not work for me. It runs without error but it does not produce a "/scan" topic for the fake laser scan. In other words, run your launch file, then in a separate terminal run the command "rostopic list | grep scan". It should return "/scan" but in my case it returns nothing.

The following modified version of your launch file does work for me:

<launch>
  <!-- kinect and frame ids -->
  <include file="$(find openni_camera)/launch/openni_node.launch"/>

  <!-- openni manager -->
  <node pkg="nodelet" type="nodelet" name="openni_manager" output="screen" respawn="true" args="manager"/>

  <!-- throttling -->
  <node pkg="nodelet" type="nodelet" name="pointcloud_throttle" args="load pointcloud_to_laserscan/CloudThrottle openni_manager">
    <param name="max_rate" value="2"/>
    <remap from="cloud_in" to="/camera/depth/points"/>
    <remap from="cloud_out" to="cloud_throttled"/>
  </node>

  <!-- fake laser -->
  <node pkg="nodelet" type="nodelet" name="kinect_laser" args="load pointcloud_to_laserscan/CloudToScan openni_manager">
    <param name="output_frame_id" value="/openni_depth_frame"/>
    <remap from="cloud" to="cloud_throttled"/>
  </node>
</launch>

The important difference is that the two nodelets now wait on openni_manager rather than openni_camera.

Using this launch file, I can add a Laser Scan display in RViz and select the "scan" topic and see the scan points.

Regarding your gmapping question, you don't mention what kind of robot you are using. Is it an iRobot Create? If so, are you using an IMU with it? Also, have you run through all the Navigation tutorials starting here: http://www.ros.org/wiki/navigation/Tutorials?
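If you do have a robot publishing odometry, a slam_gmapping launch fragment might look like the following. This is a sketch under assumed conventions (an odom frame and a base_link frame, and the fake laser publishing on /scan); it is not taken from this thread.

```xml
<launch>
  <!-- Sketch only: assumes odometry is broadcast as the odom -> base_link
       tf transform and that laser scans arrive on the /scan topic -->
  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping" output="screen">
    <param name="odom_frame" value="odom"/>
    <param name="base_frame" value="base_link"/>
  </node>
</launch>
```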

--patrick


Comments

Thanks Patrick. I'm not using any robot, I just have a Kinect sensor. Is it not enough to create a map of a non-changing environment? For the beginning it would suffice to obtain a map by just rotating the sensor in the yaw axis assuming no pitch or roll or translation. Is it not possible? tom ( 2011-03-17 21:37:09 -0500 )edit
Just a Kinect is not enough -- gmapping requires an odometry tf transformation. You might look at using the canonical_scan_matcher to estimate an odometry transformation -- although I somewhat doubt that the Kinect data will work with it (due to the lack of features, because of the narrow fov). fergs ( 2011-03-18 01:49:03 -0500 )edit
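Following up on the scan-matcher suggestion: a scan matcher from the scan_tools stack can estimate motion by aligning successive /scan messages and broadcast it on tf, standing in for wheel odometry. The sketch below is hypothetical; the node name, frame names and parameters are assumptions about a typical laser_scan_matcher setup, not something confirmed in this thread.

```xml
<launch>
  <!-- Sketch only: publish an estimated odom -> base_link transform
       from laser scan matching, in place of wheel odometry -->
  <node pkg="laser_scan_matcher" type="laser_scan_matcher_node"
        name="laser_scan_matcher" output="screen">
    <param name="fixed_frame" value="odom"/>
    <param name="base_frame" value="base_link"/>
    <param name="publish_tf" value="true"/>
  </node>
</launch>
```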
Ok, I'll go through the tutorial above, probably everything becomes clear then. Thanks all. tom ( 2011-03-18 04:37:20 -0500 )edit
Yeah, the navigation stuff has a lot of parts but is very nicely laid out in the tutorials. Then, to build a simple robot to do all the cool gmapping stuff, you could use an iRobot Create or something home grown based on the ArbotiX or Serializer controllers, to name a couple. Pi Robot ( 2011-03-18 04:48:31 -0500 )edit
I'd like to use SLAM to navigate a quadrotor UAV in an indoor env and create a floor plan. I'd like to do it with an IMU and a Kinect. However an IMU on a VTOL w/o ext aiding drifts in yaw direction. So I guess a scan matcher must be used to provide yaw correction. Do I stand a chance / any hints? tom ( 2011-03-20 21:37:09 -0500 )edit

answered 2011-03-16 08:56:56 -0500

You need to make sure that you are providing all the inputs that gmapping expects before it will start operating. See its ROS API on the wiki.

You can also use a tool like rxgraph to make sure that everything is hooked up correctly. Note that the turtlebot launch files do some remapping.
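Such a remapping in a launch file looks like the generic sketch below; the topic name kinect_scan is purely illustrative, not the actual TurtleBot remapping.

```xml
<!-- Illustrative only: route gmapping's expected "scan" topic to a
     differently named topic published elsewhere -->
<node pkg="gmapping" type="slam_gmapping" name="slam_gmapping">
  <remap from="scan" to="kinect_scan"/>
</node>
```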


answered 2012-07-14 04:29:31 -0500

Is it possible to simulate the data provided by the sensor on the bottom, so that we can demo SLAM with only the Kinect? Thank you very much!


