Hi there. There is now a tutorial available about this very question, which I just finished following successfully. The tutorial is at http://mirror.umd.edu/roswiki/openni_launch(2f)Tutorials(2f)BagRecordingPlayback.html

The approach described there is very similar to the answer given by Felix Endres. The main idea is again to capture the raw data and then run the post-processing during playback. To use the Kinect's pre-registered images, the suggestion is to run openni_launch and then record only the following topics (an example command is given after the list):

  • camera/depth_registered/image_raw
  • camera/depth_registered/camera_info
  • camera/rgb/image_raw
  • camera/rgb/camera_info
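
For concreteness, a recording command along those lines might look like the following. This is only a sketch: the bag name is illustrative, and it assumes the driver was started with depth registration enabled (e.g. depth_registration:=true).

    # Record only the raw registered depth and RGB topics (bag name is illustrative)
    rosbag record -O kinect_raw.bag \
        camera/depth_registered/image_raw \
        camera/depth_registered/camera_info \
        camera/rgb/image_raw \
        camera/rgb/camera_info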

For playback it is necessary to do the following (a command-line sketch follows the list):

  • Set the /use_sim_time parameter to true.
  • Get rosbag to publish /clock.
  • Stop openni_launch from starting the OpenNI driver.
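
Put together as shell commands, those steps might look roughly like this, with each command typically run in its own terminal. The bag name matches the recording sketch above, and I am assuming your copy of openni.launch provides the load_driver argument for skipping the driver; check your version if in doubt.

    # Use the bag's recorded time rather than wall-clock time
    rosparam set /use_sim_time true

    # Start the openni_launch processing nodelets without the OpenNI driver itself
    roslaunch openni_launch openni.launch load_driver:=false

    # Play the bag back and publish /clock so simulated time advances
    rosbag play --clock kinect_raw.bag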

More details are in the tutorial itself.

I would like to thank the author of the tutorial; it was very helpful.

I have some launch files, written for Fuerte, which approximately follow this tutorial for recording sensor data and playing it back: https://bitbucket.org/damienjadeduff/openni_bag/src

The two files you will want there, play.launch and record.launch, do approximately what you'd think they should.
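
As a rough usage sketch (the launch files may expect arguments, so check the repository before relying on this):

    # From a checkout of the repository, the launch files can be run directly by path
    roslaunch record.launch    # record the raw Kinect topics to a bag
    roslaunch play.launch      # replay a previously recorded bag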