
Why does Hector-SLAM work better than Gmapping with RPLiDAR?

asked 2019-06-27 14:17:26 -0500

Perrary

updated 2019-09-23 14:42:20 -0500

Hi. I am working with a home-made differential robot that has an RPLiDAR sensor for SLAM. Initially it worked with Hector-SLAM and I had quite good results, but now I am trying to improve them with Gmapping. I followed this tutorial for setting up the parameters of Gmapping. Then I recorded a bag, and when I build my map with Gmapping I get a worse result than the one provided by Hector. Hector's result is this one: Hector, and Gmapping's is this one: Gmapping.

I tried to follow the solution proposed in this question, but it didn't work.

I think that my problem is related to my frames, even though I followed the specifications explained here. My tf tree is this one.

I am trying to upload the maps generated by both algorithms, but I can't because I don't have five karma points and I don't know how to earn them.

Thanks for your help


My new tf tree is the following one.

And the new map made by Gmapping is this one.

Hector-SLAM's result still looks better.


The commands that I use in my project are the following:

//Terminal 1

//Terminal 2
export ROS_IP=

//Terminal 3 - This one runs the RpLidar node
ssh -X blasrobot@
export ROS_IP=
ls -l /dev |grep ttyUSB
sudo chmod 666 /dev/ttyUSB0
cd catkin_ws/
source devel/setup.bash
roslaunch rplidar_ros rplidar.launch

//Terminal 4  - This one gives me back the static transforms and the odometry data as the odom->base_link transform
ssh -X blasrobot@
export ROS_IP=
sudo chmod a+rw /dev/ttyACM0
rosrun rosserial_python serial_node.py _port:=/dev/ttyACM0 _baud:=57600

//Terminal 5 - This one sends speed to the robot
ssh -X blasrobot@
export ROS_IP=
rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '[0.0, 0, 0]' '[0, 0, 0]'

//Terminal 6 - This one runs the gmapping node
cd catkin_ws/
source devel/setup.bash
rosrun gmapping slam_gmapping scan:=scan


After some debugging I realized that my problem is with my odometry. I ran the two sanity checks provided here and I got the following results:

Straight forward:

Rotation:

Looking at this, it is obvious that my problem comes from my odometry during rotations. I checked the model and the parameters are okay. Any ideas?



Can you provide some info on how you get the odometry from the robot?

Ktysai  ( 2019-08-12 18:41:14 -0500 )

Hi Ktysai. Thanks for answering, and sorry for the delay, but I was abroad. To get my odometry I am using two encoders. Each tick of each encoder generates an interrupt, and using timers I count how many timer counts have elapsed (with the TCNTx register). With these counts I calculate the speed as: Input2 = 2 * PI / 1120 / paso2 / 5 * 10000000 (where paso2 is the number of counts, and the "/ 5 * 10000000" factor is there because I set the timers to count every 500 ns).
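In other words, that formula is just "angle per tick divided by time between ticks". A Python transcription (the 1120 ticks/rev and the 500 ns timer period are taken from my description above):

```python
import math

TICKS_PER_REV = 1120       # encoder ticks per wheel revolution
TIMER_PERIOD_S = 500e-9    # one timer count every 500 ns

def wheel_speed(timer_counts):
    """Wheel angular speed [rad/s] from the timer counts between two ticks.

    One tick = 2*pi/TICKS_PER_REV rad of wheel rotation; the time between
    ticks is timer_counts * 500 ns, which is where the "/ 5 * 10000000"
    factor in the original formula comes from.
    """
    return (2.0 * math.pi / TICKS_PER_REV) / (timer_counts * TIMER_PERIOD_S)
```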

Once i calculate the speed of each motor, i use the following relationships:

vx = R / 2 * (Input1 + Input2) * cos(th);

vy = R / 2 * (Input1 + Input2) * sin(th);

vth = R / L * (Input1 - Input2);

// Position calculation

x = x + vx * deltat;

y = y + vy * deltat;

th = th + vth * deltat;

Where deltat is the loop time.
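As a self-contained sketch, the update above looks like this in Python (R, L and the speeds here are placeholder values, not my robot's real parameters):

```python
import math

R = 0.035   # placeholder wheel radius [m]
L = 0.23    # placeholder wheel separation [m]

def integrate_odometry(x, y, th, w_left, w_right, dt):
    """One Euler-integration step of differential-drive odometry.

    w_left / w_right are wheel angular speeds [rad/s] (Input1 / Input2
    above), dt is the loop time (deltat).
    """
    vx = R / 2.0 * (w_left + w_right) * math.cos(th)
    vy = R / 2.0 * (w_left + w_right) * math.sin(th)
    vth = R / L * (w_left - w_right)
    return x + vx * dt, y + vy * dt, th + vth * dt

# Example: equal wheel speeds -> straight motion, heading unchanged.
x, y, th = integrate_odometry(0.0, 0.0, 0.0, 5.0, 5.0, 0.1)
```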

Perrary  ( 2019-08-27 11:13:15 -0500 )

Have you tried vth = R / L * (Input2 - Input1);? Depending on how you name your joints, you may have the left and right encoders inverted. In my robot I do vth = R / L * (input_right - input_left);. When traveling in a straight line the influence of this inversion is not too big, but it is obvious when spinning.

Martin Peris  ( 2019-09-23 21:44:41 -0500 )

Martin, I finally improved my odometry while spinning by changing the way I calculate the angle. I now calculate it directly from the encoder ticks.
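For reference, computing the heading directly from the accumulated tick difference (instead of integrating vth over noisy loop times) looks roughly like this; the geometry constants are placeholders, not my robot's real values:

```python
import math

TICKS_PER_REV = 1120   # encoder ticks per wheel revolution
R = 0.035              # placeholder wheel radius [m]
L = 0.23               # placeholder wheel separation [m]

def heading_from_ticks(ticks_left, ticks_right):
    """Heading [rad] from the total accumulated encoder ticks.

    Each tick corresponds to 2*pi/TICKS_PER_REV rad of wheel rotation,
    i.e. a wheel displacement of that angle times R. The heading is the
    difference of the two wheel displacements divided by the separation.
    """
    dist_per_tick = 2.0 * math.pi / TICKS_PER_REV * R
    return (ticks_left - ticks_right) * dist_per_tick / L
```

This avoids accumulating the timing error of each loop iteration: the encoder counters themselves carry the integrated displacement.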

Perrary  ( 2019-09-24 10:06:00 -0500 )

1 Answer


answered 2019-06-27 20:20:43 -0500

Hi Perrary,

I upvoted your question to give you some karma so you can upload pictures. Looking at your tf tree, it seems incorrect: the laser should not be detached from the tree. tf trees must have only one root.

So the problem is definitely with your frame definitions. The usual tf tree looks like this:

map -> odom -> base_footprint -> base_link -> laser

The transform between map and odom is provided by gmapping; the one between odom and base_footprint (usually base_link is the midpoint between the two wheels of a differential drive robot, and base_footprint is the projection of base_link onto the floor) would be provided by your odometry calculation node. The transforms between base_footprint, base_link and laser are usually static, so they are provided by a static transform publisher.
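To make the chain concrete, here is a minimal planar (x, y, yaw) sketch of how those transforms compose into the laser pose in the map frame; the numbers are made up purely for illustration:

```python
import math

def compose(a, b):
    """Compose two planar transforms a then b, each given as (x, y, yaw)."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + bx * math.cos(ath) - by * math.sin(ath),
            ay + bx * math.sin(ath) + by * math.cos(ath),
            ath + bth)

# Hypothetical numbers: SLAM's drift correction, the odometry pose, and a
# laser mounted 0.1 m ahead of base_link (the static transforms).
map_T_odom  = (0.2, -0.1, 0.05)       # published by gmapping
odom_T_bf   = (1.0, 0.5, math.pi / 2) # published by the odometry node
bf_T_bl     = (0.0, 0.0, 0.0)         # static
bl_T_laser  = (0.1, 0.0, 0.0)         # static

# map -> odom -> base_footprint -> base_link -> laser
map_T_laser = compose(compose(compose(map_T_odom, odom_T_bf), bf_T_bl),
                      bl_T_laser)
```

The important point is that every frame is reachable from map through a single chain; a detached laser frame breaks exactly this composition.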

If you provide us with your launch files we might be able to pinpoint where your problem is.



Hi Martin. Thanks for the explanation, now I understand what the base_footprint frame is (I had seen it many times but I did not know what it was). I know that my transform tree must have only one root, but my problem is with the laser frame, which is where the LiDAR publishes its information. Is it always static? Reading your answer gave me some clues about fixing my problem.

I am sorry, but I am not working with launch files; I'm new to ROS, so I rosrun every node by itself. I'm going to change my tf tree and see what happens. If the problem is solved I will tell you.


Perrary  ( 2019-06-28 09:07:30 -0500 )

Martin, shouldn't the transform between odom and map be static? Because my odom frame is the place where my robot starts to move, and my map frame is the place my map is referenced to. Thanks. I have edited the original question.

Perrary  ( 2019-06-28 13:00:45 -0500 )

No, the transform between odom and map is not static. Although both frames coincide when everything starts, odom will inevitably drift over time and become different (potentially very different) from the map frame. Actually, both gmapping and hector_mapping correct for odometry drift using the laser readings (which is what SLAM does) and publish the transformation between the map and odometry frames.
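Conceptually, the SLAM node estimates the robot pose in the map from the laser scans and then back-computes map -> odom so that the full chain stays consistent. A rough planar (x, y, yaw) sketch of that correction, with made-up poses:

```python
import math

def inverse(t):
    """Invert a planar transform (x, y, yaw)."""
    x, y, th = t
    return (-x * math.cos(th) - y * math.sin(th),
             x * math.sin(th) - y * math.cos(th),
            -th)

def compose(a, b):
    """Compose two planar transforms a then b."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + bx * math.cos(ath) - by * math.sin(ath),
            ay + bx * math.sin(ath) + by * math.cos(ath),
            ath + bth)

# Hypothetical poses: scan matching says the robot is here in the map,
# while the raw (drifting) odometry says it is there in the odom frame.
map_T_base  = (2.0, 1.0, 0.4)
odom_T_base = (1.9, 1.2, 0.3)

# The published correction: map -> odom = (map -> base) * (odom -> base)^-1,
# so that map -> odom -> base lands exactly on the scan-matched pose.
map_T_odom = compose(map_T_base, inverse(odom_T_base))
```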

Martin Peris  ( 2019-07-01 03:26:48 -0500 )

How does hector_mapping correct the odometry drift if it doesn't use the odometry data? While my map is being built I see in rviz that the odom frame changes its position a lot. From what you explained to me, I think this change is induced by the map->odom transform; shouldn't it change gradually?

Perrary  ( 2019-07-01 09:00:36 -0500 )

From the hector_mapping wiki, see Section 3.1.6, Provided tf Transforms. We could be more helpful if you updated your question with the specific commands that you are using to start your environment.

Martin Peris  ( 2019-07-01 20:07:14 -0500 )

Okay Martin, I will update it with my commands. After a couple of tests I think that my problem comes from the tf tree, especially the base_link->laser transform. EDIT: I am an ... I forgot the node that applies the transformation.

Perrary  ( 2019-07-02 11:42:47 -0500 )





Seen: 1,680 times