Why does Hector-SLAM work better than Gmapping with RPLiDAR?
Hi. I am working with a home-made differential drive robot that has an RPLiDAR sensor for SLAM. Initially it worked with Hector-SLAM and I had quite good results, but now I am trying to improve them with Gmapping. I followed this tutorial for setting up the Gmapping parameters. Then I recorded a bag, and when I build my map with Gmapping I get a worse result than the one provided by Hector. Hector's result is this one: Hector, and Gmapping's is this one: Gmapping.
I tried to follow the solution proposed in this question, but it didn't work.
I think that my problem is related to my frames, even though I followed the specifications explained here. My tf tree is this one.
I am trying to upload the maps generated by both algorithms, but I can't because I don't have five karma points, and I don't know how to earn them.
Thanks for your help.
EDIT
My new tf tree is the following:
And the new map made by Gmapping is the following:
Hector-SLAM still looks better.
EDIT 2
The commands that I use in my project are the following:
//Terminal 1
roscore
//Terminal 2
export ROS_IP=163.10.76.106
//Terminal 3 - This one runs the RpLidar node
ssh -X blasrobot@10.0.12.237 //otherwise, try 10.0.12.237
export ROS_MASTER_URI=http://163.10.76.106:11311
export ROS_IP=10.0.12.237
ls -l /dev |grep ttyUSB
sudo chmod 666 /dev/ttyUSB0
cd catkin_ws/
source devel/setup.bash
roslaunch rplidar_ros rplidar.launch
//Terminal 4 - This one publishes the static transforms and the odometry data as the odom->base_link transform
ssh -X blasrobot@10.0.12.237
export ROS_MASTER_URI=http://163.10.76.106:11311
export ROS_IP=10.0.12.237
sudo chmod a+rw /dev/ttyACM0
rosrun rosserial_python serial_node.py _port:=/dev/ttyACM0 _baud:=57600
//Terminal 5 - This one sends speed to the robot
ssh -X blasrobot@10.0.12.237
export ROS_MASTER_URI=http://163.10.76.106:11311
export ROS_IP=10.0.12.237
rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '[0.0, 0, 0]' '[0, 0, 0]'
//Terminal 6 - This one runs the gmapping node
cd catkin_ws/
source devel/setup.bash
rosrun gmapping slam_gmapping scan:=scan
EDIT 3
After some debugging I realized that my problem is with my odometry. I ran the two sanity checks provided here and I got the following results:
Straight Forward:
Rotation:
After seeing this, it is obvious that my problem comes from my odometry during rotations. I checked the model and the parameters are okay. Any ideas?
Asked by Perrary on 2019-06-27 14:17:26 UTC
Answers
Hi Perrary,
I upvoted your question to give you some karma so you can upload pictures. Looking at your tf tree, it seems to be incorrect: the laser should not be detached from the tree, as tf trees must have only one root.
So the problem is definitely in your frame definitions. The usual tf tree would look like this:
map -> odom -> base_footprint -> base_link -> laser
The transform between map and odom is provided by gmapping; the one between odom and base_footprint (usually base_link is the midpoint between the two wheels on a differential drive robot, and base_footprint is the projection of base_link onto the floor) would be provided by your odometry calculation node. The transforms between base_footprint, base_link, and laser are usually static, so they are provided by a static transform publisher.
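For example, the base_link -> laser transform can be provided from a terminal with tf's static_transform_publisher (the translation and yaw/pitch/roll values below are placeholders; use your laser's actual mounting pose):
rosrun tf static_transform_publisher 0.1 0.0 0.2 0.0 0.0 0.0 base_link laser 100
The arguments are x y z yaw pitch roll, the parent and child frame ids, and the publishing period in milliseconds.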
If you provide us with your launch files we might be able to pinpoint where your problem is.
Answered by Martin Peris on 2019-06-27 20:20:43 UTC
Comments
Hi Martin. Thanks for the explanation, now I understand what the base_footprint frame is (I had seen it many times but did not know what it was). I know that my transform tree must have only one root, but my problem is with the laser frame, which is where the LiDAR publishes its information. Is it always static? Reading your answer gave me some clues about fixing my problem.
I am sorry, but I am not working with launch files, I'm new to ROS. I rosrun every node by itself. I'm going to change my tf tree and see what happens. If the problem is solved I will tell you.
Thanks.
Commented by Perrary on 2019-06-28 09:07:30 UTC
Martin, shouldn't the transform between odom and map be static? Because my odom frame is the place where my robot starts to move, and my map frame is the place that my map is referenced to. Thanks. I have edited the original question.
Commented by Perrary on 2019-06-28 13:00:45 UTC
No, the transform between odom and map is not static. Although both frames coincide when everything starts, odom will inevitably drift over time and become different (potentially very different) from the map frame. Actually, both gmapping and hector_mapping correct for odometry drift using the laser readings (which is what is known as SLAM) and publish the transformation between map and odom.
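Concretely, the SLAM node estimates map -> base_link by scan matching and then publishes map -> odom as that estimate composed with the inverse of your odometry's odom -> base_link, so map -> odom absorbs whatever error the odometry has accumulated. A minimal sketch of that composition using the tf1 API (the function name is illustrative):
#include <tf/transform_datatypes.h>

// map->odom = (map->base_link) * (odom->base_link)^-1
tf::Transform computeMapToOdom(const tf::Transform& map_to_base,
                               const tf::Transform& odom_to_base)
{
  // any drift in the odometry estimate ends up in this transform
  return map_to_base * odom_to_base.inverse();
}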
Commented by Martin Peris on 2019-07-01 03:26:48 UTC
How does hector_mapping correct the odometry drift if it doesn't use the odometry data? While my map is being built I see in rviz that the odom frame changes its position a lot. From what you explained to me, I think this change is induced by the map->odom transform; shouldn't it change gradually?
Commented by Perrary on 2019-07-01 09:00:36 UTC
From the hector_mapping wiki: http://wiki.ros.org/hector_mapping -> Section 3.1.6, Provided tf Transforms. We could be more helpful if you updated your question with the specific commands that you are using to start your environment.
Commented by Martin Peris on 2019-07-01 20:07:14 UTC
Okay Martin, I will update it with my commands. After a couple of tests I think that my problem comes from the tf tree, especially the base_link->laser transform. EDIT: I am an ... I forgot the node that applies the transformation.
Commented by Perrary on 2019-07-02 11:42:47 UTC
Comments
Can you provide some info on how you get the odometry from the robot?
Commented by Ktysai on 2019-08-12 18:41:14 UTC
Hi Ktysai. Thanks for answering, and sorry for the delay, I was abroad. To get my odometry I am using two encoders. Each tick of each encoder generates an interrupt, and using timers I count how many timer counts elapsed since the previous tick (reading the TCNTx register). With this count I calculate the wheel speed as: Input2 = 2 * PI / 1120 / paso2 / 5 * 10000000 (where paso2 is the number of counts, 1120 is the number of encoder ticks per wheel revolution, and "/ 5 * 10000000" is there because I set the timers to count once every 500 ns, so the elapsed time is paso2 * 5e-7 seconds).
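For illustration, the measurement described above might look roughly like this on an AVR-based Arduino (the pin number, variable names, and Timer1 setup are assumptions; at 16 MHz a prescaler of 8 gives exactly one count every 500 ns):
volatile unsigned int paso2 = 1;   // timer counts between two encoder ticks

void encoder2ISR() {               // fired on every tick of encoder 2
  paso2 = TCNT1;                   // counts elapsed since the previous tick
  TCNT1 = 0;                       // restart the period measurement
}

void setup() {
  TCCR1A = 0;
  TCCR1B = (1 << CS11);            // prescaler 8: 16 MHz / 8 -> one count per 500 ns
  attachInterrupt(digitalPinToInterrupt(2), encoder2ISR, RISING);  // pin 2 is an assumption
}

void loop() {
  // one tick = 2*PI/1120 rad, elapsed time = paso2 * 5e-7 s,
  // hence the "/ 5 * 10000000" factor in the formula above
  double Input2 = 2 * PI / 1120 / paso2 / 5 * 10000000;  // wheel speed [rad/s]
}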
Once I calculate the speed of each motor, I use the following relationships:
// Differential drive kinematics (Input1, Input2: wheel speeds [rad/s],
// R: wheel radius, L: wheel separation, th: current heading)
vx = R / 2 * (Input1 + Input2) * cos(th);
vy = R / 2 * (Input1 + Input2) * sin(th);
vth = R / L * (Input1 - Input2);
// Position integration
x = x + vx * deltat;
y = y + vy * deltat;
th = th + vth * deltat;
Where deltat is the loop time.
Commented by Perrary on 2019-08-27 11:13:15 UTC
Have you tried vth = R/L * (Input2 - Input1);? Depending on how you name your joints you can have left and right encoder inverted. In my robot I do vth = R/L * (input_right - input_left);. When traveling on a straight line the influence of this inversion is not too big, but it is obvious when spinning.
Commented by Martin Peris on 2019-09-23 21:44:41 UTC
Martin, I finally improved my odometry while spinning by changing the way I calculate the angle: I now compute it directly from the encoder ticks.
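For reference, a minimal sketch of that idea in the style of the code above (the cumulative tick counters are assumed, R and L carry placeholder values for the wheel radius and separation, and the sign depends on which encoder is left/right, as Martin pointed out):
const double R = 0.035;                   // wheel radius [m] - placeholder
const double L = 0.20;                    // wheel separation [m] - placeholder
const double radPerTick = 2 * PI / 1120;  // wheel rotation per encoder tick
long ticks1 = 0, ticks2 = 0;              // cumulative ticks of each encoder

double headingFromTicks() {
  // arc-length difference between the wheels divided by their separation
  return R * (ticks1 - ticks2) * radPerTick / L;  // heading [rad]
}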
Commented by Perrary on 2019-09-24 10:06:00 UTC