
Gmapping in Gazebo with Hokuyo: problems due to max range measurements

asked 2014-03-11 04:58:19 -0500 by koenlek

updated 2016-10-24 09:10:06 -0500 by ngrennan

Hi all,

I am simulating gmapping in Gazebo with my Turtlebot. I have attached a Hokuyo laser to my (virtual) Turtlebot and want to use this for gmapping instead of the virtual Kinect laser.

The problem I have is that the virtual Hokuyo returns measurements at its maximum range. E.g. it would return a circle of measurement points if it were placed in an empty world. Gmapping uses these measurements as if obstacles were detected there. This completely messes up the scan matching process, as it tries to match everything with the non-existent circular wall.

If I use the Hokuyo on my real robot, it only returns true measurements, i.e. if it cannot detect something at a certain angle, it will not return any data point (as if the distance were infinite).

How should I adapt my virtual Hokuyo so that it does not return data points when it has not detected anything within its max range?

Or alternatively, is there any way to adapt gmapping to cope with such measurements? (That seems less elegant to me, as the difference between simulation and real experiments would then persist wherever else the measurements are used.)

Please see the screenshots attached: rviz clearly shows the 'circular non-existent' walls. I have only turned around in circles for this screenshot. If I were to also move around with the robot, the scan-matching trouble would become clear. I also have these issues (but to a lesser extent) when using it in a Gazebo indoor building world (the Willow Garage world).

Gazebo world (with laser visualized): [screenshot]

rviz result: [screenshot]

Any help would be very much appreciated.

Regards, Koen

UPDATE:

A slight addition to my comment on dornhege's answer: maxRange can probably better be omitted, as skipped/missing measurements lead to the conclusion that certain space is empty, even though the robot cannot actually see it. See for example the explored space outside the room in the screenshot... Clearly, the robot couldn't know this; it is based on the assumption that no measurement equals free space.

Does anybody know a nice way to still use this 'space is free if nothing is measured' feature, without running into this glitch (i.e. make it robust against skipped/missing/erroneous measurements)?

[screenshot: map with space outside the room marked as explored/free]


2 Answers


answered 2014-03-11 06:47:30 -0500 by dornhege

Your virtual Hokuyo seems misconfigured. The real Hokuyo will return a max-range measurement if it has not measured something. Ranges are never left out. That would screw up the whole scan.

Make sure that the range_max field in the LaserScan message for the virtual Hokuyo corresponds to the value that the sensor sends for max-range readings. If that is the case, gmapping should not map those ranges (if it does, that would be a bug).
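In practice, the value to check is the <max> element of the simulated ray sensor's <range> block, since the Gazebo laser plugin typically reports that value as range_max in the sensor_msgs/LaserScan it publishes. A minimal sketch of that element is below; the 5.6 m figure matches the sensor max used later in the comments and is otherwise an assumption:

<!-- inside the <sensor type="ray"> description of the virtual Hokuyo;
     <max> becomes range_max in the published LaserScan (assumed 5.6 m here) -->
<ray>
  <range>
    <min>0.10</min>
    <max>5.6</max>
    <resolution>0.01</resolution>
  </range>
</ray>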

There is a parameter maxUrange that you could use in gmapping. However, that is meant for other purposes and would fix the problem in the wrong place.


Comments

Thanks, that solved it! I did have to set maxUrange, however. I set the Gazebo sensor's max range to 5.6 m; maxUrange had to be slightly smaller (due to noise you otherwise still get some black dots on the circle, so I set it to 5.5 m). maxRange is also useful: set it to 5.6 so that visible free space is marked as free.
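For reference, a minimal launch sketch with these values; the node and topic names follow the standard slam_gmapping setup and are assumptions here:

<launch>
  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping" output="screen">
    <remap from="scan" to="/scan"/>
    <!-- readings at or beyond maxUrange are not used for map building;
         keeping it slightly below the sensor max drops noisy max-range returns -->
    <param name="maxUrange" value="5.5"/>
    <!-- readings between maxUrange and maxRange only clear the traversed cells as free;
         omit this parameter if that behaviour is unwanted (see the UPDATE above) -->
    <param name="maxRange" value="5.6"/>
  </node>
</launch>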

koenlek ( 2014-03-12 00:26:51 -0500 )

answered 2017-03-06 02:05:16 -0500

My solution is to use the libgazebo_ros_laser.so library instead of libgazebo_ros_gpu_laser.so.

Change gpu_ray to ray, and change libgazebo_ros_gpu_laser.so to libgazebo_ros_laser.so in the mybot.gazebo file (find your own file name and change the values in it).

Laser description: this is the non-GPU version of the GPU laser, but it essentially uses the same code. See the GPU laser documentation.

To run with RRBot, open rrbot.gazebo and change the following two lines.

replace

<sensor type="gpu_ray" name="head_hokuyo_sensor">

with

<sensor type="ray" name="head_hokuyo_sensor">

and replace

  <plugin name="gazebo_ros_head_hokuyo_controller" filename="libgazebo_ros_gpu_laser.so">

with

  <plugin name="gazebo_ros_head_hokuyo_controller" filename="libgazebo_ros_laser.so">

Save, then launch the same launch files as for the GPU laser.
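Put together, the relevant part of the .gazebo file might look roughly like the sketch below. Only the type="ray" and the libgazebo_ros_laser.so filename are the changes described above; everything else (update rate, scan samples, range values, topic and frame names) is an illustrative assumption:

<gazebo reference="hokuyo_link">
  <sensor type="ray" name="head_hokuyo_sensor">
    <update_rate>40</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>720</samples>
          <resolution>1</resolution>
          <min_angle>-1.570796</min_angle>
          <max_angle>1.570796</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.10</min>
        <max>5.6</max>
        <resolution>0.01</resolution>
      </range>
    </ray>
    <!-- CPU-based laser plugin instead of libgazebo_ros_gpu_laser.so -->
    <plugin name="gazebo_ros_head_hokuyo_controller" filename="libgazebo_ros_laser.so">
      <topicName>/scan</topicName>
      <frameName>hokuyo_link</frameName>
    </plugin>
  </sensor>
</gazebo>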

