There are two (semi-independent) issues here: rviz and the Kinect proper.

Fire up the openni drivers (without rviz) and run rostopic hz /camera/rgb/points; this will tell you how quickly your machine can actually turn Kinect data into point clouds. If that rate is lower than you need (and the CPU is pegged), the answer is that your computer isn't fast enough.
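For example (openni_node.launch is the stock launch file in the openni_camera package of this era; the name may differ in your install):

    roslaunch openni_camera openni_node.launch
    # in a second terminal:
    rostopic hz /camera/rgb/points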

How quickly rviz can visualize those clouds is a different question, and is more about your graphics card. There is substantial CPU overhead in the serialize -> transmit -> deserialize step to get the data into rviz (note that a Kinect point cloud, at frame rate, is about 300 MB/sec: 640x480 points x 32 bytes per point x 30 Hz is roughly 295 MB/sec), which could also be a problem if just getting the data is already maxing your CPU.
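You can check the actual data rate on your machine with rostopic bw, which reports the bandwidth a subscriber receives on a topic:

    rostopic bw /camera/rgb/points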

If the problem is just the GPU (rostopic hz gives you satisfactory speeds, and you have leftover CPU), you could apply a VoxelGrid filter to intelligently downsample your point cloud; that might help rviz keep up.
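A minimal sketch of that, assuming you have pcl_ros installed (the pcl/VoxelGrid nodelet, its ~input/~output topics, and the leaf_size parameter come from pcl_ros; double-check the names against your version):

    # run the VoxelGrid nodelet standalone, reading the Kinect cloud
    # and keeping one point per 5 cm voxel
    rosrun nodelet nodelet standalone pcl/VoxelGrid __name:=voxel_grid \
        voxel_grid/input:=/camera/rgb/points _leaf_size:=0.05

Then point rviz at /voxel_grid/output instead of the raw cloud.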

Around the release of eturtle, two things will happen. First, the new openni drivers (currently partially available as openni_camera_unstable) will have a "record player" mode, allowing you to store the raw depth and RGB images and produce point clouds later, via bag playback (meaning you can slow everything down without losing data). You can already roll your own version of this, to some degree; I use something similar on my netbook-based robot. Second, the drivers will become nodelets, meaning you can do away with the serialize-transmit-deserialize overhead in your own nodes (although not with rviz).
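To roll your own "record player" today: record the raw images (which are far smaller than the clouds), then play them back at a reduced rate into whatever node builds your point clouds. A sketch, assuming the usual openni topic names (yours may differ):

    rosbag record -O kinect_raw.bag /camera/rgb/image_raw /camera/rgb/camera_info \
        /camera/depth/image_raw /camera/depth/camera_info
    # later, play back at quarter speed so a slow machine can keep up
    rosbag play --rate 0.25 kinect_raw.bag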