Is it possible to calibrate the OpenNI depth "raw 16-bit value to mm" equation? [closed]

asked 2013-02-20 02:21:17 -0500

Mivia

I am currently looking at using ROS with ASUS Xtion PRO LIVE sensors to determine positions of objects. It is thus important to have depth measurements that are as precise as possible.

From what I've read so far, I can understand that the equation to calculate a distance in mm from the raw 16-bit sensor values is based on parameters provided by the manufacturer. I read that these parameters are available from flash memory in the sensor itself ( https://groups.google.com/forum/?fromgroups=#!topic/openni-dev/R4-RZ6zDB1k ).

I don't know whether the sensors are calibrated individually, but my current two Xtion PRO LIVE sensors have very different accuracy. Both sensors show an offset between the sensor-reported and the real-world distance: for one sensor this offset is 15 mm at 1 m distance, and for the other it is 56 mm at 1 m.

Since I am able to find a linear relation between error and distance for each sensor, it should be possible to create more accurate equations for calculating distance from the raw values than what is achievable with the current (uncalibrated) equations.
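For reference, fitting such a per-sensor linear correction is a simple least-squares problem. The sketch below assumes you have collected pairs of ground-truth distances (e.g. from a tape measure) and the corresponding sensor-reported distances; the numbers shown here are made-up placeholders, not real calibration data.

```python
import numpy as np

# Hypothetical calibration data: ground-truth distances (mm) and the
# corresponding sensor-reported distances. Replace with your own
# measurements from a calibration run.
true_mm     = np.array([500.0, 1000.0, 1500.0, 2000.0, 2500.0])
measured_mm = np.array([507.0, 1015.0, 1522.0, 2031.0, 2538.0])

# Least-squares fit of the linear model: true = a * measured + b
a, b = np.polyfit(measured_mm, true_mm, 1)

def correct_depth(measured):
    """Apply the fitted linear correction to a sensor reading (mm)."""
    return a * measured + b
```

With coefficients like these fitted once per sensor, the correction can then be applied to every reading from that device.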

Searching for a standardized method to do this calibration in ROS has led to nothing so far. I am amazed that no one else seems to have been stuck on this problem... Have I been extremely unlucky with my sensors, or is everybody else just going with factory calibrations even though they are far from optimal?

Maybe my problem is that the same equation is used for both of my sensors? (since Xtion devices apparently are hard to distinguish: http://answers.ros.org/question/51908/xtion-empty-device-id/#54333 )

Maybe there is a problem getting the factory calibration from the flash memory of the sensors?

Perhaps the calibration is not done for each individual unit, and I have just been more unlucky with one of my sensors than the other?

Anyway, I would be able to solve the issue if I had a way to change the raw (16-bit) -> distance (mm) equation. Are these equation parameters set in the ROS wrapper, or in the OpenNI driver? And does anyone know where to modify them? (preferably via a config file, but a hint about which source file to modify would also be a good step on the way)

I have done thorough calibration of the cameras for focal length, lens distortion, etc.

A last resort would be to re-process all point clouds and correct them in PCL. I would very much like to avoid this, as it is a processing-heavy, dirty workaround.
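If one did go the point-cloud re-processing route, one subtlety is that x and y are derived from z via the pinhole model (x = (u - cx) * z / fx), so rescaling z requires rescaling x and y by the same factor. A minimal sketch, assuming a per-sensor linear correction z_true = a * z_measured + b with hypothetical coefficients:

```python
import numpy as np

# Hypothetical per-sensor coefficients from a calibration fit.
a, b = 0.985, 0.7  # z_true = a * z_measured + b, mm scale

def correct_cloud(points_mm):
    """Correct an (N, 3) point cloud given in mm.

    Since x and y are proportional to z in the pinhole projection,
    all three coordinates are scaled by z_new / z_old.
    """
    pts = np.asarray(points_mm, dtype=float)
    z_old = pts[:, 2]
    z_new = a * z_old + b
    scale = np.where(z_old != 0, z_new / z_old, 1.0)
    return pts * scale[:, None]
```

This avoids distorting the cloud laterally, which a naive z-only correction would do.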

Any ideas anyone?


Closed for the following reason question is not relevant or outdated by tfoote
close date 2015-10-30 17:55:27.181162

Comments

I'm also interested in this. I took the approach of reprocessing, but at the level of the depth image. The downside is extra processing time, the upside is that you can do it per-pixel with different equation coefficients.

Ivan Dryanovski ( 2013-02-20 02:59:29 -0500 )

Good to know that I'm not the only one with this problem. Just to make sure I understand your workaround correctly... You take depth/image_raw and process that into a point cloud? Seems reasonable, in case we cannot get access to the equation directly...

Mivia ( 2013-02-20 03:15:24 -0500 )

I take the depth/image_raw, run it through my calibration model, and output another depth image. From there, I can build a point cloud if I need to (using depth_image_proc, or my own proc nodelets). It made a very noticeable difference in the visual odometry and map quality.

Ivan Dryanovski ( 2013-02-20 04:17:51 -0500 )
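The per-pixel depth-image correction described in the comment above could be sketched as follows. The coefficient images (alpha, beta) are hypothetical here and would come from a per-pixel calibration step; this is not the commenter's actual implementation.

```python
import numpy as np

def correct_depth_image(depth_mm, alpha, beta):
    """Apply depth_corrected = alpha * depth + beta element-wise.

    depth_mm: (H, W) uint16 depth image in mm; zeros mark invalid pixels.
    alpha, beta: (H, W) float coefficient images from calibration.
    """
    depth = depth_mm.astype(np.float32)
    corrected = alpha * depth + beta
    corrected[depth_mm == 0] = 0  # keep invalid pixels invalid
    return np.clip(corrected, 0, 65535).astype(np.uint16)
```

The corrected image can then be fed to depth_image_proc (or similar) to build the point cloud, as described above.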

Thx. I think I'll have to come up with a similar workaround, as I don't really expect a better alternative. If you continuously process depth/image_raw, can you then still get access to the rgb image at the same time? I know that I cannot stream both ir and rgb images, but maybe this is different?

Mivia ( 2013-02-21 02:46:52 -0500 )

Yes, the depth image and rgb image are streamed at the same time.

Ivan Dryanovski ( 2013-02-21 03:20:10 -0500 )

I'm facing the same problem. Can you give some insight into your calibration model / calibration procedure? From my experiments, the distortion is depth-dependent.

AReimann ( 2015-01-14 01:23:27 -0500 )