
Navigation planning based on Kinect data in 2.5D?

asked 2011-02-27 06:17:35 -0500 by evanmj

updated 2016-10-24 08:59:02 -0500 by ngrennan

I have a wheeled robot with a front mounted SICK laser scanner.

The recent addition of a Kinect allows me to use the SICK laser for longer-range navigation planning and mapping, and the Kinect for various 3D tasks.

So, my idea is to convert the Kinect point cloud to laser scans at various heights (via PCL). Say, for instance, my SICK laser is mounted 8" above the ground. It will not tell me about a curb or some other obstacle that lies just under 8". If I were to map the appropriate Z range of the Kinect's point cloud data to a new laser scan topic, I could then use it for navigation and write some code to decide what to do at that Z level. A simple example would be to determine the height of an obstacle my robot could negotiate based on its wheel size, and just slow it down to the appropriate speed. It could also check for height clearance while driving around by creating a laser scan that corresponds to the highest Z value of the robot.
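Here is a rough sketch of what I have in mind, assuming the cloud has already been transformed into a frame where Z is height above the ground (the topic names and the Z band are just placeholders):

    // Rough sketch: keep only a thin horizontal slice of the Kinect cloud so it
    // can be flattened into a "virtual" laser scan at a chosen height.
    #include <ros/ros.h>
    #include <sensor_msgs/PointCloud2.h>
    #include <pcl/point_types.h>
    #include <pcl/filters/passthrough.h>
    #include <pcl_conversions/pcl_conversions.h>

    ros::Publisher slice_pub;

    void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& msg)
    {
      pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
      pcl::fromROSMsg(*msg, *cloud);

      // Keep only points in a band just below the 8" (~0.20 m) SICK plane,
      // e.g. to catch a curb the SICK would miss.
      pcl::PassThrough<pcl::PointXYZ> pass;
      pass.setInputCloud(cloud);
      pass.setFilterFieldName("z");
      pass.setFilterLimits(0.10, 0.20);  // placeholder band, in meters

      pcl::PointCloud<pcl::PointXYZ> slice;
      pass.filter(slice);

      sensor_msgs::PointCloud2 out;
      pcl::toROSMsg(slice, out);
      out.header = msg->header;
      slice_pub.publish(out);  // this slice could then be fed to a cloud-to-scan converter
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "kinect_z_slice");
      ros::NodeHandle nh;
      slice_pub = nh.advertise<sensor_msgs::PointCloud2>("kinect_slice", 1);
      ros::Subscriber sub = nh.subscribe("camera/depth/points", 1, cloudCallback);
      ros::spin();
      return 0;
    }

Each slice like this would become its own laser scan topic, one per Z level of interest.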

I see this being useful for quadcopters... it is still not full 3D navigation, but it could allow for some decent obstacle avoidance in the Z dimension by writing code to determine which Z height has the clearest path.

My question is: is anyone using laser scans at various heights to evaluate navigation at different Z levels? Is converting the Kinect data to laser scans a viable approach, or is there a better way to do this?

I see this as a possible workaround for this problem.


Comments

Why convert to laser scans in the first place? Much of the existing navigation stack can use PointClouds for sensory input.
Eric Perko ( 2011-02-27 06:49:56 -0500 )
I had the same idea with a master's thesis student at our lab (Enea Scioni, also following the list); however, he went back to Italy, where he's doing a PhD now, so perhaps he continued the work. The idea we had was to include Z acquisition information in a map, so that robots could reason about how to interpret maps that were acquired by other types of robots. I also think this could be useful to quadrotors, as long as you can encode it in a memory-optimized way. Right now the data coming from the Kinect is too large to build full 3D maps with the limited onboard resources.
KoenBuys ( 2011-02-27 17:07:37 -0500 )
Have you considered using a voxel grid? costmap_2d offers that option.
joq ( 2011-03-06 02:32:37 -0500 )

1 Answer


answered 2011-03-06 06:15:20 -0500 by fergs

updated 2011-03-06 06:17:23 -0500

The navigation stack will support most of your use case out of the box, because you have a true long-range laser for localization (which that other problem you reference did not). You might note that this is nearly the same set of inputs the PR2 uses: a Hokuyo laser on the base for localization and obstacle avoidance, and a stereo camera rig for additional obstacle avoidance in 3D.

When configuring your robot's launch and parameter files, use only the SICK laser as input to AMCL for localization. Then use both the SICK and the Kinect data as observation sources for the local costmap (see http://www.ros.org/wiki/costmap_2d for details on parameters).
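As a rough sketch (the topic and frame names are assumptions about your setup), the observation sources in your costmap parameter YAML could look something like this:

    # Mark and clear obstacles from both the SICK and the (downsampled) Kinect cloud.
    observation_sources: sick_scan kinect_cloud

    sick_scan: {sensor_frame: base_laser_link, data_type: LaserScan,
                topic: scan, marking: true, clearing: true}

    kinect_cloud: {sensor_frame: camera_link, data_type: PointCloud2,
                   topic: voxel_grid/output, marking: true, clearing: true,
                   min_obstacle_height: 0.05, max_obstacle_height: 1.2}

The min/max obstacle heights let the costmap keep low obstacles like curbs while ignoring returns above the robot.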

It might also be advisable to set up a voxel grid to downsample and clean your Kinect data before sending it into costmap_2d. The pcl_ros package contains a nodelet that can do this, so you could configure it entirely in a launch file without any new custom code (see http://www.ros.org/wiki/pcl_ros/Tutorials/VoxelGrid%20filtering for details).
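For example, something along these lines would load the VoxelGrid nodelet and publish a downsampled cloud you can point the costmap at (the Kinect topic name and leaf size are assumptions):

    <launch>
      <!-- Nodelet manager for the PCL filters -->
      <node pkg="nodelet" type="nodelet" name="pcl_manager" args="manager" output="screen"/>

      <!-- Downsample the Kinect cloud before it reaches costmap_2d -->
      <node pkg="nodelet" type="nodelet" name="voxel_grid"
            args="load pcl/VoxelGrid pcl_manager" output="screen">
        <remap from="~input" to="/camera/depth/points"/>
        <rosparam>
          leaf_size: 0.02
          filter_field_name: z
          filter_limit_min: 0.0
          filter_limit_max: 3.0
        </rosparam>
      </node>
    </launch>

The filtered cloud comes out on voxel_grid/output, which is what you would list as the Kinect observation source in the costmap configuration above.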


Comments

Hi, were you able to accomplish the navigation using gmapping? Any help would be really appreciated.

ctguell ( 2013-07-28 20:32:32 -0500 )
