# Kinect amcl

Hi everybody! I am testing amcl on a KUKA youBot equipped with a Kinect. Following this tutorial http://www.ros.org/wiki/navigation/Tutorials/Navigation%20Tuning%20Guide I drive the robot with a joypad around a known map to check amcl's behavior.

This is what I obtain: http://www.youtube.com/watch?v=Bn0Bde0BjSs

As you can see, the correction in translation is quite good; the problem is that the pose estimate doesn't rotate enough to correct the odometry error in rotation.

Here is my amcl launch file:

```xml
<launch>
  <node pkg="amcl" type="amcl" name="amcl">
    <!-- Publish scans from best pose at a max of 10 Hz -->
    <param name="base_frame_id" value="/base_footprint"/>
    <param name="odom_model_type" value="omni"/>
    <param name="odom_alpha5" value="0.5"/>
    <param name="transform_tolerance" value="0.2" />
    <param name="gui_publish_rate" value="10.0"/>
    <param name="laser_max_beams" value="60"/>
    <param name="min_particles" value="1000"/>
    <param name="max_particles" value="10000"/>
    <!--param name="kld_err" value="0.05"/-->
    <param name="kld_z" value="0.99"/>
    <param name="odom_alpha1" value="0.5"/>
    <param name="odom_alpha2" value="0.5"/>
    <!-- translation std dev, m -->
    <param name="odom_alpha3" value="0.5"/>
    <param name="odom_alpha4" value="0.5"/>
    <param name="laser_z_hit" value="0.95"/>
    <param name="laser_z_short" value="0.15"/>
    <param name="laser_z_max" value="0.03"/>
    <param name="laser_z_rand" value="0.01"/>
    <param name="laser_sigma_hit" value="0.002"/>
    <param name="laser_lambda_short" value="0.1"/>
    <param name="laser_model_type" value="likelihood_field"/>
    <param name="laser_likelihood_max_dist" value="4.0"/>
    <param name="update_min_d" value="0.05"/>
    <param name="update_min_a" value="0.05"/>
    <param name="odom_frame_id" value="/odom"/>
    <param name="resample_interval" value="1"/>
    <param name="recovery_alpha_slow" value="0.001"/>
    <param name="recovery_alpha_fast" value="0.1"/>
    <param name="initial_pose_x" value="0.35"/>
    <param name="initial_pose_y" value="1.08"/>
    <param name="initial_pose_a" value="1.57"/>
    <param name="first_map_only" value="true"/>
    <param name="laser_max_range" value="4.0"/>
    <param name="laser_min_range" value="0.6"/>
  </node>
</launch>
```


The youBot odometry is not very good, so I set the odom_alpha parameters high, and I tried to trust the laser much more by using a very low laser_sigma_hit value.
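To see why laser_sigma_hit = 0.002 is an extreme setting, here is a small sketch (not amcl code) of the Gaussian weighting that the likelihood_field model applies to each beam endpoint, based on its distance to the nearest map obstacle. With a 2 mm sigma, an endpoint only 5 cm off the map contributes essentially zero weight, so noisy Kinect scans get rejected almost entirely:

```python
import math

def hit_weight(dist_to_obstacle, sigma_hit):
    """Unnormalized likelihood-field weight for one beam endpoint:
    a Gaussian in the distance to the nearest obstacle in the map."""
    return math.exp(-dist_to_obstacle**2 / (2.0 * sigma_hit**2))

# A Kinect-derived scan endpoint 5 cm from the nearest obstacle:
d = 0.05
print(hit_weight(d, 0.2))    # sigma = 0.2 m: ~0.97, the beam still counts
print(hit_weight(d, 0.002))  # sigma = 2 mm: ~0, the beam is rejected
```

The sigma values here are only illustrative; the point is that sigma_hit should reflect the actual noise of the (converted) Kinect scan, not act as a "trust" knob.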

Any suggestion to make it work better? Thank you, guys.



Thanks for supplying the bag file; that always helps in trying to debug this sort of issue.

It looks like the robot is consistently underreporting the actual rotation. I watched the scans accumulate in the odom frame in rviz (as described here), and would estimate that it's underreporting by 5-10 degrees per rotation.

amcl's odometry model assumes zero-mean noise. It has a hard time correcting for systematic bias, which your robot seems to exhibit. If your robot is indeed always underreporting rotation (more data might be required to verify that), then I would try to correct the odometry before sending it to amcl. You might try a variation on the TurtleBot calibration, which uses a wall as a fixed feature to compare to. Or you might just inflate the reported yaw differences by 1-3%. Either way, if there's a consistent bias, it should be possible to do a one-time correction (not a periodic re-calibration). Such a fix could probably be incorporated into the YouBot odometry calculation somewhere.
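A minimal sketch of the "inflate the reported yaw differences" idea, assuming a constant scale factor measured once from the bias above (the 353° figure and the placement of this correction in the youBot odometry node are hypothetical examples, not measured values):

```python
import math

# Hypothetical one-time calibration: if the robot reports ~353 deg for a
# true 360 deg turn (7 deg short, within the 5-10 deg/rev estimate above),
# scale every incremental yaw by a constant factor before integrating it
# into the odometry that amcl consumes.
YAW_SCALE = 360.0 / 353.0  # ~1.02, i.e. inflate reported yaw by ~2%

def corrected_yaw_delta(reported_delta):
    """Scale one reported yaw increment by the calibrated factor."""
    return reported_delta * YAW_SCALE

# Integrating a full turn reported as 353 deg recovers ~360 deg:
total = sum(corrected_yaw_delta(math.radians(353.0 / 100)) for _ in range(100))
print(round(math.degrees(total)))  # -> 360
```

In practice this scaling would live in the odometry calculation itself (wheel radius / geometry constants), so that /odom is corrected before any consumer sees it.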

If fixing the odometry is not feasible, I would try tuning amcl's odometry parameters. In this situation, odom_alpha1 is probably the most important one to increase, with odom_alpha4 coming next. I would leave the other parameters low (at their defaults).
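As a sketch of that tuning direction (values are illustrative only, not tested on a youBot): raise odom_alpha1 (rotation noise caused by rotation), raise odom_alpha4 (translation noise caused by rotation) somewhat less, and keep the rest near the documented default of 0.2:

```xml
<!-- Illustrative values: distrust rotation odometry, keep the rest at
     roughly the default (0.2). alpha1 = rotation noise from rotation,
     alpha4 = translation noise from rotation. -->
<param name="odom_alpha1" value="0.8"/>
<param name="odom_alpha2" value="0.2"/>
<param name="odom_alpha3" value="0.2"/>
<param name="odom_alpha4" value="0.6"/>
```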

I would not increase confidence on the laser model; the Kinect isn't really giving you high-quality laser scans. Instead, I would take the laser model configuration from turtlebot_navigation.


Thank you so much for your advice; we had the same impression. The odometry is probably incorrect during rotation because of changes in the payload carried by the robot.

I modified the turtlebot_calibration package to use it with my robot. Obviously it does not solve the problem, but it delays its onset.

I expected the amcl node could handle this offset error during rotation by using the laser (for example, in the same way turtlebot_calibration does) to match against the particle filter's cells.

I would like to know whether this is possible with amcl. If it is not, I could insert periodic calibration steps during the robot's movements.

Thank you for your attention :)


Lorenzo, please use the comment feature (like this) to reply to posts. If you're adding more detail, you can edit anything you have previously written. @Brian Gerkey explained that amcl expects zero-mean noise, which can usually be calibrated out once at startup.
( 2011-11-10 10:39:07 -0500 )

I have uploaded a file to make it easier to understand what is happening. Just extract the folder into your ROS package path and run in a terminal:

```shell
rospack profile
roscd amcl_test_pkg
roslaunch amcl_test.launch
rosbag play amcl_bag.bag
```


If somebody could have a look at this, it would be very much appreciated :)

Thank you

Lorenzo


Hi all,

I am quite new to ROS but, like Lorenzo, I am interested in understanding a bit better how the AMCL localization algorithm works. In my case I have some difficulty ensuring that the integrated odometry error has zero mean (on the position estimate).

When the robot's motion properties change (because of floor properties or payload changes), it is common to register some slip in the wheels, which introduces a constant error in the subsequent odometry localization. These effects are unknown and cannot be compensated for ahead of time.

Therefore, even if there is an approximately zero-mean error on the velocity estimate (differential kinematics), for me it is nearly impossible to guarantee that the drift will be removed from the odometry integration.

In similar localization problems (e.g. extended Kalman filters), the filter is aware that odometry provides only differential motion information (plus errors) and is robust enough to remove small drifts caused by the numerical integration.
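The distinction between zero-mean noise and a constant bias can be illustrated with a toy calculation (not AMCL code): integrating increments with zero-mean noise leaves no net heading error, while a small constant bias accumulates linearly, and no zero-mean noise model (the odom_alpha* parameters) can remove it:

```python
# Integrate 1000 yaw increments. Alternating +/-0.01 rad noise (zero mean)
# cancels out pair by pair; a constant +0.01 rad bias per step grows
# linearly into a large heading error.
n, e = 1000, 0.01
zero_mean_err = sum(e if i % 2 == 0 else -e for i in range(n))
biased_err = sum(e for _ in range(n))
print(zero_mean_err)        # -> 0.0 (cancels)
print(round(biased_err, 6)) # -> 10.0 rad of accumulated heading error
```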

I have tried to take a look at the AMCL source code. I may be wrong, but it seems to me that there are two differences from the estimation approach used in other probabilistic estimators:

• the error is propagated only on the absolute 'clean' odometry position and is not reiterated on previous position corrections (therefore the algorithm assumes only small position drifts, with no velocity drifts);

• the pose correction is only refined when the robot moves, thus not using the regular laser information that is also available while the robot stands still.
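The second point corresponds to amcl's motion-gated update thresholds. They can be lowered so updates trigger on smaller motions (values below are illustrative only), but there is no parameter that makes the filter update while the robot is completely stationary:

```xml
<!-- Illustrative values: trigger a filter update after 1 cm of translation
     or ~1 degree of rotation instead of the launch file's 0.05/0.05. -->
<param name="update_min_d" value="0.01"/>
<param name="update_min_a" value="0.02"/>
```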

Finally, I tried to make some changes to the code for debugging, but in my case (ROS Diamondback, Ubuntu 11.10) the code does not compile from source (I get some syntax errors).

Thank you for your help C.Alberto


@CAlberto, this is enough of a question to stand on its own. To get it attention, please ask it as a new question instead of "answering" this question with a question.
( 2011-11-26 19:08:33 -0500 )