gmapping makes a broken map when following a straight wall

asked 2019-08-23 02:43:26 -0600

Murat

updated 2019-08-23 02:46:01 -0600

Hi,

I have a mobile robot with a lidar sensor for mapping and localization in a Gazebo environment. I am using slam_gmapping for mapping.

My problem is with gmapping and its parameters. It can create a map while my robot drives through areas with and without obstacles. But when it follows a straight wall, gmapping produces a broken map and the robot's localization is wrong. You can watch the YouTube link to see the problem. How can I solve it?

Thanks all.


2 Answers


answered 2019-08-23 03:44:44 -0600

Delb

This is a common "issue" with gmapping; it's not really an issue, because that is how the algorithm works. In your video we can see that your robot has a limited field of view (about 90°, from -45° to 45° apparently), so when it follows a straight wall on its right it doesn't detect anything on its left (because of how far apart your walls are).

What gmapping does for localization is compare what the sensor detects against the map it is building. In your case the sensor always returns nearly identical data because the wall is straight. That wouldn't be a problem if the robot encountered another obstacle a little further on, or if it could see the wall behind it. But since your robot keeps getting the same data, after a while the algorithm decides the odometry must be wrong (odometry can be very noisy, so gmapping gives priority to the scan data over it) and "corrects" the robot's position accordingly.

You have some options to avoid this :

  • If you are just using the simulation to test the mapping and don't mind changing your Gazebo world, you can add more obstacles that the sensor will detect while following the wall. But you can run into the same issue if you add the obstacles symmetrically, spaced at the same distance. Here's a picture to explain it better:

[image: symmetrically spaced obstacles along a straight wall]

  • You can switch to a SLAM algorithm that gives more weight to odometry, if your odometry data is good.
  • Maybe you can tune some of the gmapping parameters, but I'm not sure about that (see the sketch after this list).
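
On that last point, here is a minimal sketch of the kind of tuning that is possible. The parameter names are real gmapping parameters, but the values are illustrative guesses, not tested settings; check the gmapping wiki for the exact semantics. Lowering the odometry error parameters tells gmapping to trust odometry more, so the scan matcher is less free to drag the pose along a featureless wall:

    <launch>
      <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping" output="screen">
        <!-- Odometry error model (defaults: srr 0.1, srt 0.2, str 0.1, stt 0.2).
             Lower values mean "my odometry is reliable", so scan matching
             corrections move the pose estimate less. Values are guesses. -->
        <param name="srr" value="0.05"/> <!-- translation error per translation -->
        <param name="srt" value="0.1"/>  <!-- translation error per rotation -->
        <param name="str" value="0.05"/> <!-- rotation error per translation -->
        <param name="stt" value="0.1"/>  <!-- rotation error per rotation -->
      </node>
    </launch>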

Comments

SLAM alone is never the solution. Look at this book: https://amzn.to/2L1K51G There is a good description there of why.

duck-development (2019-08-23 05:12:50 -0600)

"SLAM alone is never the solution."

Can you give a bit more detail? Do you mean SLAM based on only one source, like odometry, lidar, or IMU?

Delb (2019-08-23 05:28:25 -0600)

I am not using an IMU or any other sensors; there is only a Hokuyo lidar scanner on my robot. But how can I change my SLAM priority between odometry and LaserScan? Can you recommend a source or document?

Murat (2019-08-23 05:39:09 -0600)

SLAM with only a lidar never works in real environments. You need a bunch of sensors to handle the real world: lasers, cameras, depth cameras, ultrasonic sensors, encoders, IMU, GPS, radar, ...

duck-development (2019-08-23 05:50:19 -0600)

Yeah, but I am working in simulation for now; I just want to solve this issue in Gazebo. The real environment and adding my other sensors is my next goal. :)

Murat (2019-08-23 06:17:56 -0600)

If you are only in simulation then just changing your Gazebo world would be a solution: add a bunch of obstacles so that the lidar can always detect several of them within its range (see the sketch below).
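
A minimal sketch of a static box obstacle, to be added inside the <world> element of your .world file (the name, pose and size here are arbitrary):

    <model name="obstacle_box">
      <static>true</static>
      <pose>2 1 0.25 0 0 0</pose>
      <link name="link">
        <collision name="collision">
          <geometry><box><size>0.5 0.5 0.5</size></box></geometry>
        </collision>
        <visual name="visual">
          <geometry><box><size>0.5 0.5 0.5</size></box></geometry>
        </visual>
      </link>
    </model>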

What type of real environment will you put your robot in?

Delb (2019-08-23 06:31:15 -0600)

@duck-development Yes, I agree that only one source isn't enough, but you don't always need a camera AND ultrasonic sensors AND encoders etc.; that's just too much. Some applications work fine with only a lidar, an IMU and odometry. It depends on how precise you want/need to be.

Delb (2019-08-23 06:33:25 -0600)

You can create an environment where a lidar alone is enough, but then you have to build that environment. If you look at the Xiaomi vacuum you get an LDS, an IMU, encoders, extra IR sensors and ultrasonic sensors, and they look at every part to keep the cost down.

duck-development (2019-08-23 16:18:56 -0600)

answered 2019-08-26 05:44:36 -0600

Murat

Finally, I found it: just add the <param name="minimumScore" value="260"/> parameter to the slam_gmapping node, because my laser range is 5 m. I found this in the gmapping parameter documentation:

~minimumScore (float, default: 0.0): Minimum score for considering the outcome of the scan matching good. Can avoid jumping pose estimates in large open spaces when using laser scanners with limited range (e.g. 5 m). Scores go up to 600+; try 50, for example, when experiencing jumping estimate issues.
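
For reference, a minimal sketch of the launch file with this fix in place. The pkg/type names are the standard gmapping ones; the maxUrange value is an assumption based on my 5 m laser, adjust it for your setup:

    <launch>
      <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping" output="screen">
        <!-- Reject scan matches scoring below this threshold, so a weak,
             ambiguous match along a featureless wall cannot move the pose. -->
        <param name="minimumScore" value="260"/>
        <!-- Assumption: the usable laser range is 5 m, as stated above. -->
        <param name="maxUrange" value="5.0"/>
      </node>
    </launch>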

Thanks to @Delb and @duck-development

