Hector SLAM with odometry switch
I've spent a good amount of time trying to find a solution to the Hector SLAM long-corridor localisation problem (if you found this through a Google search you know exactly what I'm talking about). Instead of doing sensor fusion with my robot's odometry, I've decided to switch over to odometry as soon as the scan matching covariance in Hector SLAM starts showing erratic behaviour. The variable is called lastScanMatchCov. In my testing I found that the middle value of the 3x3 covariance matrix spikes in long-corridor scenarios, and the spike occurs early enough for the odometry "switch" to trigger.
The plot shows the middle element in different scenarios: N are the non-long-corridor environments, while C are the long-corridor environments. As seen in the plot, all the long-corridor environments yield a large spike, and this is used as the trigger to switch to odometry.
When the switch triggers, I want to use the odometry pose estimate instead of the current poseHintWorld that the laser scan matching algorithm uses. I know that the odometry is being published on the /odom topic, and rostopic echoing this topic gives me updates on the position. My problem is: how do I get the odometry position estimate into the code? I'm fairly new to ROS and I've finished the subscriber/publisher tutorial, but still can't seem to break through. Here are the code parts I want to edit in HectorSlamProcessor.h:
```cpp
bool update(const DataContainer& dataContainer, const Eigen::Vector3f& poseHintWorld, bool map_without_matching = false, bool force_update = false)
{
    bool ret = false;

    Eigen::Vector3f newPoseEstimateWorld;

    if (!map_without_matching) {
        // middle value of the 3x3 covariance matrix
        if (lastScanMatchCov(1, 1) > 100.0)
        {
            std::cout << "\nunsure!\n";
            // USE DIFFERENT POSE ESTIMATE
            // poseHintWorld = "position estimate from /odom topic"
            //   (note: poseHintWorld is a const reference, so it can't be assigned to directly)
            // newPoseEstimateWorld = poseHintWorld
        }
        newPoseEstimateWorld = (mapRep->matchData(poseHintWorld, dataContainer, lastScanMatchCov));
    } else {
        newPoseEstimateWorld = poseHintWorld;
    }

    lastScanMatchPose = newPoseEstimateWorld;

    //std::cout << "\nt1:\n" << newPoseEstimateWorld << "\n";
    //std::cout << "\n1";
    //std::cout << "\n" << lastScanMatchPose << "\n";

    if (util::poseDifferenceLargerThan(newPoseEstimateWorld, lastMapUpdatePose, paramMinDistanceDiffForMapUpdate, paramMinAngleDiffForMapUpdate) || map_without_matching || force_update) {
        ret = true;
    }
    // ... (rest of update() unchanged)
```
Hope you guys can help me out.
Edit: I should mention that the map is generated in a very harsh environment for Hector SLAM.
- The corridors are way longer than the range of the scanner, eliminating the possibility of the scanner seeing the wall at the end of the corridor.
- The map is drawn such that the walls along the corridor have absolutely zero features. This is done because the robots have a really good odometry estimate from combining IMU and wheel odometry.
- The solution must be a variant of Hector SLAM; this is not my decision, these are just my instructions :)
Update: I'm now able to extract the position data of the robot with the help of a subscriber:
```cpp
void HectorMappingRos::odomCallback(const nav_msgs::Odometry::ConstPtr& msg)
{
    Eigen::Vector3f odometry_pose;
    odometry_pose(0) = msg->pose.pose.position.x;
    odometry_pose(1) = msg->pose.pose.position.y;
    // orientation.z on its own is just one quaternion component; convert
    // the full quaternion to a yaw angle instead (tf::getYaw is declared
    // in <tf/transform_datatypes.h>)
    odometry_pose(2) = tf::getYaw(msg->pose.pose.orientation);

    std::cout << "\nodometry_pose_vector\n" << odometry_pose << "\n";
}
```
But how do I get this information into the SLAM update() function? Somehow I want to extract the information and extend the update function so that it also takes the position from odometry:
```cpp
bool update(const DataContainer& dataContainer, const Eigen::Vector3f& poseHintWorld, const Eigen::Vector3f& odometryPoseWorld, bool map_without_matching = false, bool force_update = false)
```
(the new parameter has to come before the defaulted parameters, otherwise this won't compile)
But I don't know how to "extract" the odometry information and use it in the scanCallback function. I've linked it here since I couldn't get the formatting right. Any suggestions?
Update 2: I got the whole thing working now. I've tested with a few corridors and so far I've only seen a very big improvement! I will post pictures of the final maps later; I have some CPU issues that need resolving first :)
Update 3: I can't recreate the results, which probably suggests that the implementation is very unstable. Any suggestions are very, very welcome :(
Update 4: I want to use the robot_localization package, but since I'm using Fuerte this is not possible, as it's not supported for this distro? The only semi-successful result I am able to create is one where the robot relies completely on odometry.
Asked by JamesWell on 2016-03-22 03:57:04 UTC
Answers
You can modify HectorMappingRos to subscribe to the /odom topic, which contains the pose data of the robot. Then pass this data (which should come at a fixed interval) to slamProcessor, either in the same update function or in a separate function.
Asked by DavidN on 2016-03-22 05:47:04 UTC
Comments
I've tried this but have had a little trouble knowing where to start. Though I am making progress now, will report back, thank you!
Asked by JamesWell on 2016-03-22 07:11:38 UTC
@JamesWell, any update?
Asked by DavidN on 2016-03-29 06:35:56 UTC
Right now I'm just using poseHintWorld to rely only on odometry, which creates a nearly perfect map (since it's in simulation). The next step is to fuse the estimated position from odometry with the estimated position from the laser scanner, starting with a very simple "hardcoded" filter.
Asked by JamesWell on 2016-03-29 10:56:41 UTC
@DavidN I ended up just setting "use_tf_pose_start_estimate = true". It maps long corridors, but they are very skewed due to IMU drift.
Asked by JamesWell on 2016-06-02 06:06:14 UTC
Comments
I've upvoted you, so you can now probably upload the image.
Asked by gvdhoorn on 2016-03-22 04:16:24 UTC
Thank you!
Asked by JamesWell on 2016-03-22 04:47:52 UTC