
Implementing GMapping with Two Lidars

asked 2015-06-24 14:43:30 -0500

CMobley7

So, I am attempting to autonomously navigate a Clearpath Robotics Husky using GMapping. The Husky is currently equipped with two Sick LMS151 lidars, one centered on each of the left and right sides. At present, the laser data from the two Sick LMS151 lidars is merged using the ira_laser_merger package available at https://github.com/iralabdisco/ira_la... . Unfortunately, while this package does merge the laser data, it does so in a way that creates false data at various points around the Husky. For example, when I stand in front of the Husky and move backwards, the merged scan shows that correctly; however, it also shows me moving toward the back of the Husky. Is there a way to fix this problem with the packages I am currently using, or is there a better method for using two lidars with GMapping?
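For reference, this is roughly how the two scans are being merged. A minimal launch sketch, assuming the laserscan_multi_merger node from ira_laser_tools and hypothetical input topic names /scan_left and /scan_right; parameter names should be checked against the installed version of the package:

<launch>
  <!-- Merge the two LMS151 scans into a single LaserScan topic for GMapping.
       Topic names and the destination frame below are assumptions; adjust to your setup. -->
  <node pkg="ira_laser_tools" type="laserscan_multi_merger" name="laserscan_multi_merger" output="screen">
    <param name="destination_frame" value="base_link"/>
    <param name="scan_destination_topic" value="/scan_merged"/>
    <param name="laserscan_topics" value="/scan_left /scan_right"/>
  </node>
</launch>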


Comments

Hi, has this been solved?

asimay_y (2016-02-02 20:58:47 -0500)

I am facing the same issue. I have already used these methods but did not get a proper map. Is there any other package or method available?

akshaywifi (2020-07-20 12:18:47 -0500)

1 Answer


answered 2016-02-03 09:55:50 -0500

CMobley7

First, I suggest ensuring that the LIDARs are accurately centered both on the robot and in the URDF. While the URDF showed our LIDARs were centered, they were slightly off on the actual robot. Because of how we affixed the LIDARs to the robot, it was easier for us to fine-tune the placement of the LIDAR sensors in the URDF. We displayed the LaserScan data from each LIDAR in RViz, each in a different color, then performed multiple iterations of placing an object at different distances and slightly adjusting each LIDAR's position in the URDF until both scans agreed on the object's location. This could just as easily be done the other way around, slightly adjusting the physical placement of the LIDAR sensors on the robot until the scans agree. This removed most of the false data, so depending on your application it may be the optimal solution.

In addition, I suggest using the filters in the laser_filters package on the raw LaserScan data to make sure you are feeding quality data to GMapping.

Another option is to use laser_filters to limit each Sick LMS151 to a 180-degree field of view and add another sensor for the front-facing FOV. This last option is what we chose to implement, because the LIDARs would not have had more than a 180-degree FOV in our final application. Sketches of the URDF fine-tuning and the angular filtering are shown below.
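As an illustration of the URDF fine-tuning step, here is a sketch of a fixed laser-mount joint whose origin offsets can be nudged until the two scans agree; the link names and the numeric offsets are hypothetical:

<!-- Hypothetical joint for the left LMS151; tweak the xyz/rpy values in small steps
     while comparing both colored scans against a fixed object in RViz. -->
<joint name="left_laser_joint" type="fixed">
  <parent link="base_link"/>
  <child link="left_laser"/>
  <origin xyz="0.0 0.28 0.25" rpy="0 0 1.5708"/>
</joint>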
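For the last option, a minimal sketch of clipping one scan to a 180-degree field of view with the laser_filters LaserScanAngularBoundsFilter; the topic names and the exact angle bounds are assumptions, and a second copy of the node would be needed for the other lidar:

<launch>
  <!-- Clip the left scan to +/- 90 degrees (in radians) before it reaches the merger/GMapping.
       Input/output topic names are assumptions; remap them to match your setup. -->
  <node pkg="laser_filters" type="scan_to_scan_filter_chain" name="left_scan_filter">
    <remap from="scan" to="/scan_left"/>
    <remap from="scan_filtered" to="/scan_left_filtered"/>
    <rosparam>
      scan_filter_chain:
        - name: angle_bounds
          type: laser_filters/LaserScanAngularBoundsFilter
          params:
            lower_angle: -1.5708
            upper_angle: 1.5708
    </rosparam>
  </node>
</launch>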

