
Kinect - How to get a consistent set of data?

asked 2011-05-02 07:55:10 -0600 by Matthieu

updated 2016-10-24 09:05:59 -0600 by ngrennan

Hi,

I'm currently writing a program that uses Kinect data to perform image processing. I'm using the OpenNI Kinect driver on Ubuntu 10.10 with ROS Diamondback.

The image processing needs the color image, the depth image and the point cloud, so I subscribe to the topics "camera/rgb/image_color", "camera/depth/image" and "camera/depth/points". This lets me keep pointers to the data and, if needed, save each stream as images or binary files.

My problem is that I don't know how to get a consistent set of these data, i.e. RGB, depth and point cloud messages that share the same time stamp.

The program loops with a while(true) that contains:

  • a cvWaitKey(1) to trigger saving and to exit the loop,
  • and a ros::spinOnce() to actually process callbacks.

At the moment the program saves point cloud and depth data with the same time stamp, but the RGB image never matches. Do you know how to get an RGB image captured at the same moment?


3 Answers


answered 2011-05-02 19:24:12 -0600

You can use the approximate time synchronization policy, which mostly gets the right data bundle. The depth image and the point cloud should have exactly the same time stamp, and the RGB image shouldn't be off by more than 1/30 s. I recently introduced a filter in rgbdslam that checks for time offsets, and it does get triggered every now and then.

By the way, it's better to loop with while(ros::ok()) { ... } instead of while(true): that way, if your node gets killed, your program will stop cleanly.
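A minimal sketch of that policy with message_filters, using the three topics from the question (the node name, queue sizes and the 1/30 s behavior of the spin loop are assumptions, not tested code):

```cpp
#include <ros/ros.h>
#include <boost/bind.hpp>
#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/PointCloud2.h>

// Bundle an RGB image, a depth image and a point cloud whose stamps are close.
typedef message_filters::sync_policies::ApproximateTime<
    sensor_msgs::Image, sensor_msgs::Image, sensor_msgs::PointCloud2> SyncPolicy;

void callback(const sensor_msgs::ImageConstPtr& rgb,
              const sensor_msgs::ImageConstPtr& depth,
              const sensor_msgs::PointCloud2ConstPtr& cloud) {
  // All three messages here belong to (approximately) the same instant;
  // do the saving / processing in this one place instead of per-topic.
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "kinect_sync");  // hypothetical node name
  ros::NodeHandle nh;
  message_filters::Subscriber<sensor_msgs::Image> rgb_sub(nh, "camera/rgb/image_color", 1);
  message_filters::Subscriber<sensor_msgs::Image> depth_sub(nh, "camera/depth/image", 1);
  message_filters::Subscriber<sensor_msgs::PointCloud2> cloud_sub(nh, "camera/depth/points", 1);

  message_filters::Synchronizer<SyncPolicy> sync(SyncPolicy(10), rgb_sub, depth_sub, cloud_sub);
  sync.registerCallback(boost::bind(&callback, _1, _2, _3));

  while (ros::ok()) {   // stops when the node is killed
    ros::spinOnce();
  }
  return 0;
}
```

The callback only fires once the policy has found a matching triple, so the saving logic no longer has to compare time stamps by hand.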


answered 2011-05-02 08:45:30 -0600

In general you can't expect perfect synchronization between RGB and depth from the Kinect. From the openni_camera wiki page:

Synchronization

Unlike the Primesense devices, the Kinect does not support hardware synchronization. Thus the time points at which the RGB image and the depth image are captured may be up to ~16 ms apart. For all devices we use an ApproximateTimeSynchronizer to generate the point clouds on /camera/rgb/points. For Primesense devices, hardware synchronization is turned on automatically if both streams are used (subscribing to image + depth).

Note that /camera/depth/image and /camera/depth/points contain essentially the same information: the former holds the depth of each pixel and the latter the point cloud derived from that depth data. Subscribing to both simultaneously is probably overkill.


answered 2015-09-10 12:32:37 -0600

I made a frame counter, with the count based on the frame rate (16 fps), and captured a depth and a color image with the same frame number, yet the depth and color images still show a difference. My email is alyzart22@gmail.com


