localization of robot/kinect within saved octomap

asked 2012-04-04 11:58:58 -0500 by Scott

updated 2016-10-24 09:02:18 -0500 by ngrennan

Hi, I have been able to successfully use the rgbdslam package with a Kinect to create a 3D scan and send it to octomap. After that, I am able to save it as a .bt file and visualize it with octovis. Question: how can I use the saved octomap to localize my robot/Kinect? Thanks in advance. -Scott
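For context, reading the saved .bt back into an OcTree with the OctoMap C++ API looks roughly like this (just a sketch; "map.bt" stands in for my actual file):

    // Minimal sketch: load a saved binary octomap (.bt) back into an OcTree.
    // "map.bt" is a placeholder for the file written after the rgbdslam/octomap run.
    #include <octomap/OcTree.h>
    #include <iostream>

    int main() {
      octomap::OcTree tree("map.bt");  // the constructor reads the .bt file
      std::cout << "Loaded octree with " << tree.size() << " nodes at "
                << tree.getResolution() << " m resolution" << std::endl;
      return 0;
    }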


1 Answer

answered 2012-04-05 08:26:05 -0500 by AHornung

updated 2012-09-25 03:48:35 -0500

There are many options for that, but as far as I know no working solution out of the box. You could, for example, scan-match your point clouds against the known volumetric OctoMap model, but that's fairly involved and computationally expensive.
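Just to illustrate the idea (a rough sketch with my own placeholder names, not code from an existing package): you could turn the occupied voxels of the OctoMap into a reference point cloud and align each downsampled Kinect cloud against it with ICP, e.g. using PCL:

    // Rough sketch: localize by matching the current (downsampled) Kinect cloud
    // against a point cloud built from the occupied voxel centers of the OctoMap.
    // Function name and parameters are illustrative placeholders.
    #include <octomap/OcTree.h>
    #include <pcl/point_cloud.h>
    #include <pcl/point_types.h>
    #include <pcl/registration/icp.h>
    #include <Eigen/Core>

    Eigen::Matrix4f localizeByScanMatching(
        const pcl::PointCloud<pcl::PointXYZ>::Ptr& scan,  // downsampled Kinect cloud
        const octomap::OcTree& map,
        const Eigen::Matrix4f& initial_guess)             // e.g. odometry or last pose
    {
      // Build the reference cloud from all occupied leaf voxels of the map.
      pcl::PointCloud<pcl::PointXYZ>::Ptr map_cloud(new pcl::PointCloud<pcl::PointXYZ>);
      for (octomap::OcTree::leaf_iterator it = map.begin_leafs(), end = map.end_leafs();
           it != end; ++it) {
        if (map.isNodeOccupied(*it)) {
          octomap::point3d c = it.getCoordinate();
          map_cloud->push_back(pcl::PointXYZ(c.x(), c.y(), c.z()));
        }
      }

      // Align the scan to the map cloud with ICP, seeded with the initial guess.
      pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;
      icp.setInputSource(scan);
      icp.setInputTarget(map_cloud);
      pcl::PointCloud<pcl::PointXYZ> aligned;
      icp.align(aligned, initial_guess);
      return icp.getFinalTransformation();  // sensor pose in the map frame (if converged)
    }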

I did some 3D localization with laser, IMU, and odometry on a humanoid in a known OctoMap model; you can find details in the paper "Humanoid Robot Localization in Complex Indoor Environments". As a measurement model, you can directly use the ray-casting functionality in OctoMap ("castRay" in OcTree), although I would suggest downsampling the full point cloud and exploiting parallelization with OpenMP.
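For illustration, a per-scan weighting function based on castRay could look roughly like this (the Gaussian range model, sigma, and the no-hit penalty are placeholders of mine, not the exact model from the paper):

    // Sketch of a ray-casting measurement model on top of octomap::OcTree::castRay:
    // downsample the cloud beforehand, cast the rays in parallel with OpenMP, and
    // score the agreement between measured and map-predicted ranges.
    // "endpoints" are the measured 3D points, already transformed into the map
    // frame by the pose hypothesis being evaluated.
    #include <octomap/OcTree.h>
    #include <vector>

    double measurementLogLikelihood(const octomap::OcTree& map,
                                    const octomap::point3d& sensor_origin,
                                    const std::vector<octomap::point3d>& endpoints,
                                    double max_range = 5.0, double sigma = 0.05)
    {
      double log_likelihood = 0.0;

      #pragma omp parallel for reduction(+:log_likelihood)
      for (int i = 0; i < (int)endpoints.size(); ++i) {
        octomap::point3d dir = endpoints[i] - sensor_origin;
        double measured_range = dir.norm();
        dir.normalize();

        octomap::point3d hit;
        if (map.castRay(sensor_origin, dir, hit, true, max_range)) {
          // Compare the range predicted by the map with the measured range.
          double diff = measured_range - (hit - sensor_origin).norm();
          log_likelihood += -0.5 * diff * diff / (sigma * sigma);  // Gaussian range model
        } else {
          log_likelihood += -10.0;  // constant penalty if the ray hits nothing in the map
        }
      }
      return log_likelihood;
    }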

Edit: I just published the localization code at http://ros.org/wiki/humanoid_localization

It's mostly designed for humanoid robots and is still being polished right now, but I'm sure you can use much of the sensor-model code as an example.

