AMCL Issues with the kidnapped robot problem (tuning needed?)
Hi,
I've been experimenting with AMCL for solving the kidnapped robot problem. I don't really understand why, but it localizes quite well in the "y" direction (the horizontal axis in the video below), yet always gets it wrong in the "x" direction (the vertical axis in the video below). There doesn't seem to be any odd symmetry in the map that could be confusing it, so I'm a bit at a loss.
If I initialize the algorithm with the correct pose and a low covariance, then it works well. However, it locks onto a bad solution if I increase the initial covariance dramatically.
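To make this concrete, the relevant amcl settings look roughly like the sketch below (the parameter names are the standard amcl ones, but the values here are illustrative, not my exact config):

```xml
<node pkg="amcl" type="amcl" name="amcl">
  <!-- Initializing near the true pose with a tight covariance works fine: -->
  <param name="initial_pose_x" value="0.0"/>
  <param name="initial_pose_y" value="0.0"/>
  <param name="initial_pose_a" value="0.0"/>
  <param name="initial_cov_xx" value="0.25"/>  <!-- m^2 -->
  <param name="initial_cov_yy" value="0.25"/>  <!-- m^2 -->
  <param name="initial_cov_aa" value="0.07"/>  <!-- rad^2 -->
  <!-- ...whereas blowing these covariances up (e.g. 25.0 / 25.0 / ~pi^2)
       is when amcl locks onto the wrong x solution. -->
</node>
```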
The video below shows an example of my issue. I expected AMCL to converge to a pose much closer to the base_link transform, not as far away as it actually ends up (as seen in the tf tree). The red dots in the video represent the lidar measurements projected from the moving frame in the tf tree, i.e. the robot's base_link frame (I'm sorry it isn't very visible in the video, but the font size for that can't be increased in RViz). If the lidar measurements were instead projected from the AMCL solution, we would see them shifted by -1.5 m in the x direction (1.5 m is the rough distance between the robot's estimated base_link frame and AMCL's solution). To be clear, the robot is moving from right to left in the map, ending close to the origin of the map frame.
For more clarity on what is going on, the second video below shows a Gazebo simulation of the robot (a drone) flying horizontally and capturing the lidar data (in blue) that was used in the video above. Notice that because I am using simulation, I have ground truth, so I know that AMCL is locking onto a wrong solution:
You can see all my launch files and run the very same example by downloading the files from the following folder: https://drive.google.com/drive/folder...
To launch the example above, just place all the files in the same folder and run the command below (make sure you have the amcl and map_server packages installed; I am using version 1.17.1 of the navigation stack on ROS Noetic, Ubuntu 20.04):
```sh
LAUNCH_DIR=`pwd` roslaunch launch_amcl.launch
```
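For anyone who doesn't want to download the folder: the launch file is essentially the standard map_server + amcl pairing. Below is a trimmed sketch of what it contains (file names here are illustrative; the real files are in the linked folder):

```xml
<launch>
  <!-- Serve the static map of the simulated environment. -->
  <node pkg="map_server" type="map_server" name="map_server"
        args="$(env LAUNCH_DIR)/map.yaml"/>
  <!-- amcl consumes the laser scans and the map, and publishes map->odom. -->
  <node pkg="amcl" type="amcl" name="amcl">
    <rosparam file="$(env LAUNCH_DIR)/amcl_params.yaml" command="load"/>
  </node>
</launch>
```

(In this sketch, the LAUNCH_DIR=`pwd` from the command above is what would feed the $(env LAUNCH_DIR) substitutions.)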
Does anyone with AMCL experience have a good idea of what is wrong with my usage? Any ideas on how to tune the parameters better? Any good explanation as to why I keep getting bad solutions like this? I have also been using the Probabilistic Robotics book as a reference, and I feel like I understand the algorithm; still, I don't fully understand why the solution keeps converging to what seems to be a different local minimum.
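Side note for anyone reproducing this: instead of inflating the initial covariance, the kidnapped-robot condition can also be set up with amcl's global_localization service (a std_srvs/Empty service), which scatters the particles uniformly over the free space of the map. For example, one hypothetical way to trigger it from a launch file:

```xml
<!-- One-shot call that disperses amcl's particles uniformly over the map. -->
<node pkg="rosservice" type="rosservice" name="scatter_particles"
      args="call --wait /global_localization"/>
```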
Different things I tried:
- If I set the initial XY position to [0.0 ...
Please edit your description to explain what we are seeing in the video. Is the red line the lidar scan? Is the robot physically moving, or is it stationary?
You edit using the "edit" button at the end of your description.
Thank you for the suggestions. I have (hopefully) improved my description, and added extra videos that can hopefully make it all clearer =)
Please add your amcl configuration to the description (rather than using an external link). It's OK if it is many lines. Format it using the 101010 (code) button.
Added to the description.
Overall, your amcl config looks pretty reasonable so I have only small tweaks to suggest.
- Why is odom_alpha3 so different from the other odom_alphaX? I haven't used the omni model, but I think I read that 0.2 was good only for the diff odom_model, and that omni wanted significantly smaller values? Try dividing by 10.
- Increase laser_z_hit to 0.8, and make a corresponding decrease in laser_z_rand.
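In launch-file form, those tweaks would look something like this (illustrative values: I'm assuming you start from the diff-style 0.2 alphas, and that z_hit and z_rand keep summing to 1 as in the defaults):

```xml
<param name="odom_alpha3" value="0.02"/>  <!-- the diff-style 0.2 divided by 10 -->
<param name="laser_z_hit" value="0.8"/>
<param name="laser_z_rand" value="0.2"/>  <!-- decreased so z_hit + z_rand still sums to 1 -->
```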