How to create a 180 degree laserscan from a panning Kinect?

asked 2011-04-19 18:09:01 -0500 by Bart

updated 2016-10-24 08:59:15 -0500 by ngrennan

The Kinect has a narrow (57 degree) field of view, which is limiting for obstacle avoidance, navigation and map making, as pointed out elsewhere by others. My Kinect is mounted on a pan/tilt mechanism, so I should be able to pan the sensor and create a wider simulated laserscan (though with an admittedly lower scan rate than a real laser). An advantage, however, is that the Kinect should be able to identify obstacles over the full height of the robot, rather than just at the laser elevation. Does anyone have experience or example code available for this task?


1 Answer


answered 2011-04-19 18:12:38 -0500 by Bart

Based on some help in previous ROS questions/answers, I have developed a nodelet "pipeline" that augments the pointcloud_to_laserscan package provided in turtlebot. The pipeline takes the point cloud from "cloud_throttle.cpp" and transforms it geometrically, based on the pan and tilt angles broadcast to tf, using pcl_ros::transformPointCloud in a new nodelet, "cloud_to_scan_tf.cpp". A third nodelet, "scan_to_wide.cpp", then overlays a sequence of panned narrow scans into a wider 180 degree scan. The result simulates a horizontally mounted, forward facing, planar laser with a wider field of view.
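The tilt part of that geometric correction can be sketched in isolation. This is a hypothetical, self-contained illustration of what the tf transform accomplishes, not the nodelet code (which uses pcl_ros::transformPointCloud on full point clouds); the frame convention (x forward, y left, z up, positive tilt pitching the sensor down about y) and the function name are my assumptions:

```cpp
#include <cmath>

// A point in a 3-D Cartesian frame.
struct Point { double x, y, z; };

// Rotate a point measured in the tilted sensor frame into the level frame,
// pitching it about the y axis by `tilt` radians. A point the tilted sensor
// sees `tilt` radians above its own optical axis comes out level and forward,
// which is exactly the correction needed to emit a horizontal laserscan.
Point levelFromTilted(const Point& p, double tilt) {
    double c = std::cos(tilt), s = std::sin(tilt);
    return { c * p.x + s * p.z,   // x' = x cos(t) + z sin(t)
             p.y,                 // rotation about y leaves y unchanged
            -s * p.x + c * p.z }; // z' = -x sin(t) + z cos(t)
}
```

With the 25 degree downward tilt described above, a ray 25 degrees above the sensor's axis maps onto the level forward direction, so obstacles at robot height still land in the horizontal scan plane.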

The "cloud_to_scan_tf.cpp" nodelet was documented previously as "cloud_to_scanHoriz.cpp" in the "pointcloud to laserscan with transform?" ROS Answers question.

Attached below is the code for the "scan_to_wide" nodelet and a typical launch file. A Bool message starts/stops the panning. The nodelet sends a Float32 message to a hardware interface program to move the pan servo, and receives a JointState message back indicating the pan servo position. The Kinect is tilted downwards 25 degrees to improve visibility just in front of the robot, but tf corrects the laserscan to horizontal.

While the robot is moving, the Kinect is stationary, facing forward. When the robot stops, it does a 180 degree pan, which takes about 3 seconds. The Kinect nodelets and other hardware interface programs run on a small netbook (Atom N270, 1.6 GHz) at 100% CPU, but use only 20 KB/sec of wireless network bandwidth to display the laserscan in rviz on a remote desktop computer.
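The overlay step in "scan_to_wide" amounts to re-binning each narrow scan's rays at the current pan angle. Here is a hypothetical sketch of that bookkeeping in plain C++, without ROS message types; the Scan struct and the keep-the-nearest-range policy are my assumptions, not the attached nodelet code:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Minimal stand-in for the angular fields of sensor_msgs/LaserScan.
struct Scan {
    double angle_min;            // bearing of the first ray, radians
    double angle_increment;      // angular step between rays, radians
    std::vector<double> ranges;  // one range per ray
};

// Overlay a narrow scan taken at pan angle `pan` into a wide accumulated
// scan. A narrow-scan ray at bearing a (in the sensor frame) lands in the
// wide scan at bearing a + pan; each wide bin keeps the nearest range seen.
void overlay(const Scan& narrow, double pan, Scan& wide) {
    for (std::size_t i = 0; i < narrow.ranges.size(); ++i) {
        double angle = narrow.angle_min + i * narrow.angle_increment + pan;
        int j = static_cast<int>(
            std::round((angle - wide.angle_min) / wide.angle_increment));
        if (j < 0 || j >= static_cast<int>(wide.ranges.size()))
            continue;  // ray falls outside the wide scan's field of view
        if (narrow.ranges[i] < wide.ranges[j])
            wide.ranges[j] = narrow.ranges[i];  // keep closest obstacle
    }
}
```

Initializing the wide scan's ranges to infinity before a pan sweep and calling overlay() once per incoming narrow scan yields the 180 degree result when the sweep finishes.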

I'm interested if anyone has suggestions for improvement or experience integrating a panning Kinect with the navigation stack. As the FAQ suggests, a longer discussion should be moved to the mailing list.

/*
 * Copyright (c) 2010, Willow Garage, Inc.
 * All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions are met:
 *
 *     * Redistributions of source code must retain the above copyright
 *       notice, this list of conditions and the following disclaimer.
 *     * Redistributions in binary form must reproduce the above copyright
 *       notice, this list of conditions and the following disclaimer in the
 *       documentation and/or other materials provided with the distribution.
 *       documentation and/or other materials provided with the distribution.
 *     * Neither the name of the Willow Garage, Inc. nor the names of its
 *       contributors may be used to endorse or promote products derived from
 *       this software without specific prior written permission.
 *
 * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
 * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
 * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
 * ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
 * LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
 * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
 * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
 * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
 * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
 * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
 * POSSIBILITY OF SUCH DAMAGE.
 */

Comments

I can report that a panning Kinect-to-laserscan setup can work well with gmapping. I only update the map when a new scan comes in, and only send a wide scan when the robot is stationary, after a pan is completed. The scan matching localization in gmapping is impressive; it compensates for poor odometry.
Bart (2011-04-23 07:08:37 -0500)

Stats

Asked: 2011-04-19 18:09:01 -0500

Seen: 1,304 times

Last updated: Apr 19 '11