
Hector SLAM with odometry switch

asked 2016-03-22 03:57:04 -0500 by JamesWell

updated 2016-03-29 00:30:43 -0500

I've spent a good amount of time trying to find a solution to the Hector SLAM long-corridor localisation problem (if you found this through a Google search you know exactly what I'm talking about). Instead of doing sensor fusion with the odometry of my robot, I've decided to switch to odometry as soon as the scan matching covariance in Hector SLAM starts showing erratic behaviour. The relevant variable is called lastScanMatchCov. In my testing I found that the middle value of the covariance matrix spikes in long-corridor scenarios, and the spike occurs early enough for the odometry "switch" to trigger.

The plot shows the middle element in different scenarios: N are the non-long-corridor environments, while C are the long-corridor environments. As seen in the plot, all the long-corridor environments show a large spike, which is used as the trigger to switch to odometry.

[Plot: lastScanMatchCov(1,1) over time for the N (non-corridor) and C (corridor) scenarios]

When the switch triggers, I want to use the odometry pose estimate instead of the current poseHintWorld that the laser scan matching algorithm uses. I know that the odometry is being published on the /odom topic, and rostopic echoing this topic gives me updates on the position. My problem is: how do I get the odometry position estimate into Hector SLAM? I'm fairly new to ROS and I've finished the subscriber/publisher tutorial, but I still can't work out how to connect the two. Here are the code parts I want to edit in HectorSlamProcessor.h:

bool update(const DataContainer& dataContainer, const Eigen::Vector3f& poseHintWorld, bool map_without_matching = false, bool force_update = false)
{
  bool ret = false;

  Eigen::Vector3f newPoseEstimateWorld;

  if (!map_without_matching){

    // middle value of the 3x3 scan match covariance matrix
    if (lastScanMatchCov(1,1) > 100.0)
    {
      std::cout << "\nunsure!\n";
      // USE DIFFERENT POSE ESTIMATE
      // poseHintWorld = "position estimate from /odom topic"
      // newPoseEstimateWorld = poseHintWorld
    }

    newPoseEstimateWorld = (mapRep->matchData(poseHintWorld, dataContainer, lastScanMatchCov));
  } else {
    newPoseEstimateWorld = poseHintWorld;
  }

  lastScanMatchPose = newPoseEstimateWorld;
  //std::cout << "\nt1:\n" << newPoseEstimateWorld << "\n";
  //std::cout << "\n" << lastScanMatchPose << "\n";

  if(util::poseDifferenceLargerThan(newPoseEstimateWorld, lastMapUpdatePose, paramMinDistanceDiffForMapUpdate, paramMinAngleDiffForMapUpdate) || map_without_matching || force_update){
    ret = true;
  }
  // ...
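For reference, here is a minimal sketch of how that branch could look once an odometry pose is available inside update(). The extra odomPoseWorld parameter is an assumption added for illustration; it is not part of the stock Hector SLAM API:

bool update(const DataContainer& dataContainer, const Eigen::Vector3f& poseHintWorld,
            const Eigen::Vector3f& odomPoseWorld,   // assumed: latest odometry pose (x, y, yaw)
            bool map_without_matching = false, bool force_update = false)
{
  bool ret = false;
  Eigen::Vector3f newPoseEstimateWorld;

  if (!map_without_matching){
    Eigen::Vector3f poseHint = poseHintWorld;

    // If the middle element of the scan match covariance spikes,
    // fall back to the odometry estimate as the matching hint.
    if (lastScanMatchCov(1,1) > 100.0){
      poseHint = odomPoseWorld;
    }

    newPoseEstimateWorld = mapRep->matchData(poseHint, dataContainer, lastScanMatchCov);
  } else {
    newPoseEstimateWorld = poseHintWorld;
  }

  lastScanMatchPose = newPoseEstimateWorld;
  // ... rest of update() unchanged
}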

Hope you guys can help me out.

Edit: I should mention that the map is generated in a very harsh environment for Hector SLAM.

  1. The corridors are way longer than the range of the scanner, eliminating the possibility of the scanners seeing the wall at the end of the corridor.
  2. The map is drawn such that the walls along the corridor have absolutely no features. This is deliberate, because the robots already have a really good odometry estimate from fusing IMU and wheel odometry.
  3. The solution must be a variant of Hector SLAM, this is not my decision, these are just my instructions :)

Update: I'm now able to extract the position data of the robot with the help of a subscriber:

void HectorMappingRos::odomCallback(const nav_msgs::Odometry::ConstPtr& msg)
{
  Eigen::Vector3f odometry_pose;
  odometry_pose(0) = msg->pose.pose.position.x;
  odometry_pose(1) = msg->pose.pose.position.y;
  // Convert the orientation quaternion to a yaw angle (tf/transform_datatypes.h);
  // using orientation.z directly would not give the heading.
  odometry_pose(2) = tf::getYaw(msg->pose.pose.orientation);
  std::cout << "\nodometry_pose_vector\n" << odometry_pose << "\n";
}

But how do I pass this information into the SLAM update?
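One way to make that pose available to the SLAM update, sketched under the assumption of a new member variable lastOdomPose_ (the name is made up) and the extended update() signature from the sketch above:

// HectorMappingRos.h -- hypothetical additions:
//   ros::Subscriber odom_sub_;
//   Eigen::Vector3f lastOdomPose_;

// In odomCallback, cache the pose instead of only printing it:
lastOdomPose_(0) = msg->pose.pose.position.x;
lastOdomPose_(1) = msg->pose.pose.position.y;
lastOdomPose_(2) = tf::getYaw(msg->pose.pose.orientation);

// In HectorMappingRos::scanCallback, where slamProcessor->update() is called,
// the cached odometry pose can then be passed as the extra argument:
// slamProcessor->update(laserScanContainer, startEstimate, lastOdomPose_);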


Comments


I've upvoted you, so you can now probably upload the image.

gvdhoorn  ( 2016-03-22 04:16:24 -0500 )

Thank you!

JamesWell  ( 2016-03-22 04:47:52 -0500 )

1 Answer


answered 2016-03-22 05:47:04 -0500 by DavidN

You can modify HectorMappingRos to subscribe to the /odom topic, which contains the pose data of the robot. Then pass this data (which should arrive at a fixed interval) to slamProcessor, either in the same update function or in a separate function.
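A minimal sketch of the subscription side, assuming HectorMappingRos has a node handle member called node_; the odom_sub_ and odomCallback names are illustrative additions, not existing hector_mapping members:

// HectorMappingRos.h (added members, names are illustrative):
//   ros::Subscriber odom_sub_;
//   void odomCallback(const nav_msgs::Odometry::ConstPtr& msg);

// In the HectorMappingRos constructor:
odom_sub_ = node_.subscribe("odom", 10, &HectorMappingRos::odomCallback, this);

With the subscription in place, the callback fires at the odometry publish rate and can cache the latest pose for the next scan match.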


Comments

I've tried this but have had a little trouble knowing where to start. Though I am making progress now, will report back, thank you!

JamesWell  ( 2016-03-22 07:11:38 -0500 )

@JamesWell, any update?

DavidN  ( 2016-03-29 06:35:56 -0500 )

Right now I'm just using the poseHintWorld to rely only on odometry, which creates a nearly perfect map (since it's in simulation). The next step is to fuse the estimated position from odometry with the estimated position from the laser scanner, starting with a very simple "hardcoded" filter.

JamesWell  ( 2016-03-29 10:56:41 -0500 )

@DavidN I ended up just setting "use_tf_pose_start_estimate = true". It maps long corridors, but they come out very skewed due to IMU drift.

JamesWell  ( 2016-06-02 06:06:14 -0500 )
