The primary cause of this issue was a misinterpretation of how robot_localization should be used. Although the package is described as "merging" the output of sensors, the output of amcl can't simply be fused with raw sensor odometry, because amcl reports coordinates in the map frame while the sensors report raw relative coordinates: on the map I used, amcl reported a position of X: 20, Y: -60 after being given an estimated pose in RViz, while the sensors only tracked the offset from their own starting point of X: 0, Y: 0. This caused the "epileptic" effect of the robot jumping between those two estimates, as both packages held firm to their own estimation.
As such, the original two localization packages, laser_scan_matcher and amcl, were temporarily removed and replaced with two robot_localization nodes, both fed pose and orientation messages from the LIDAR sensor (via laser_scan_matcher, configured not to publish to tf) and the IMU sensor (an MPU-9250).
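A minimal launch sketch of this arrangement might look like the following. The topic names (`/pose_stamped`, `/imu/data`) are assumptions based on the standard laser_scan_matcher and robot_localization interfaces, not taken from my actual files:

```xml
<launch>
  <!-- Scan matcher estimates a 2D pose from the LIDAR, but is told
       not to publish tf so robot_localization can own that transform -->
  <node pkg="laser_scan_matcher" type="laser_scan_matcher_node" name="laser_scan_matcher">
    <param name="publish_tf" value="false"/>
    <param name="publish_pose_stamped" value="true"/>
  </node>

  <!-- EKF fusing the matcher's pose estimate with the IMU's orientation -->
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_odom">
    <param name="two_d_mode" value="true"/>
    <param name="pose0" value="/pose_stamped"/>
    <param name="imu0" value="/imu/data"/>
  </node>
</launch>
```

In a real setup the `pose0_config`/`imu0_config` matrices would also need to be loaded (e.g. via rosparam) to tell the EKF which fields of each message to trust.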
Although this still hasn't proven to be the right solution, since the map now moves with the robot, it did resolve the "epilepsy" caused by the two wildly different estimates from amcl and the other packages (i.e. the massive jumping between 0, 0 and 20, -60) by keeping only one source of estimation.
The big picture hasn't been fully resolved yet, but the lesson learned from this issue is to review the pose messages returned by the packages and to avoid having two parties report wildly different estimates. That can be done by reviewing the tf tree for conflicts and by using rostopic echo to check that the estimated positions from each package actually match.
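Concretely, those checks could look something like the commands below; the topic names are just examples from a typical amcl + robot_localization setup and would need to be substituted with your own:

```
# Render the tf tree and look for two nodes claiming the same transform
rosrun tf view_frames

# Compare the position estimates side by side
rostopic echo /amcl_pose/pose/pose/position
rostopic echo /odometry/filtered/pose/pose/position
```

If the two echoed positions disagree by tens of meters, as they did in my case, fusing them will produce exactly the jumping behavior described above.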
The setup now replaces laser_scan_matcher in the tf tree with robot_localization, which fuses the IMU's orientation data with the LIDAR's position estimate.
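For reference, the corresponding robot_localization configuration could be sketched roughly as below. The topic names and the exact boolean matrices are assumptions and would need tuning against the actual sensor messages:

```yaml
frequency: 30
two_d_mode: true

# X/Y position from the scan matcher's pose estimate
pose0: /pose_stamped
pose0_config: [true,  true,  false,   # x, y, z
               false, false, false,   # roll, pitch, yaw
               false, false, false,   # vx, vy, vz
               false, false, false,   # vroll, vpitch, vyaw
               false, false, false]   # ax, ay, az

# Yaw (and yaw rate) from the IMU
imu0: /imu/data
imu0_config: [false, false, false,
              false, false, true,    # yaw
              false, false, false,
              false, false, true,    # yaw velocity
              false, false, false]
```

Each `_config` matrix is the standard 15-element robot_localization layout (position, orientation, linear velocity, angular velocity, linear acceleration), selecting which fields of that input the filter should fuse.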