Conversion of depth image coordinates (u,v,d) to world coordinates (x,y,z)

asked 2014-01-21 02:23:56 -0500 by stfn

Hi, I implemented an object detection classifier working on RGB and depth data (from openni_launch) which provides object center coordinates in the image plane (u,v). Now I want to convert those image coordinates (plus depth) to world coordinates (i.e., the 3D camera frame).

I know I could just use the (u,v) coordinates and the registered cloud to get the point stored at (u,v), but sometimes there is a NaN at that point, and I don't want to deal with that, since it would require looking at the neighbours, which biases the object center. Another drawback is that after some PCL calculations (resampling, removing NaNs, calculating normals, removing major planes, etc.) the cloud is no longer organized.

Second, I could use the /camera_info topic to get the intrinsics of my depth sensor and compute the conversion by hand using the pinhole camera equations, but this is some trouble to do correctly (considering all the intrinsics of the camera matrix).
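For reference, the by-hand route would be roughly the following sketch (not from the original post): it assumes a rectified, undistorted image, a depth d in meters at pixel (u,v), and intrinsics fx, fy, cx, cy taken from the K matrix of the CameraInfo message (K[0], K[4], K[2], K[5]).

#include <opencv2/core/core.hpp>

// Back-project pixel (u,v) with depth d (meters) into the camera frame.
cv::Point3d backProject(double u, double v, double d,
                        double fx, double fy, double cx, double cy)
{
  return cv::Point3d((u - cx) * d / fx,   // x
                     (v - cy) * d / fy,   // y
                     d);                  // z: depth along the optical axis
}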

So ... since openni_launch already does this for the whole point cloud creation, is there a function or service offering this conversion in openni_launch, the OpenNI API, the PCL API, ros_perception, etc.?


3 Answers

answered 2014-01-21 05:18:57 -0500

updated 2014-01-21 05:19:39 -0500

image_geometry can do this very easily (unfortunately the API docs are broken right now).

image_geometry::PinholeCameraModel model_;
model_.fromCameraInfo(cam_info);  // cam_info: the sensor_msgs::CameraInfo of the camera
cv::Point3d ray = model_.projectPixelTo3dRay(point2d);  // point2d is a cv::Point2d

This will give you a ray which you can then normalize and multiply by your depth to get the 3D point.
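A minimal sketch of that last step, assuming the depth value at (u,v) is the z-distance along the optical axis (which is what the openni_launch depth images contain); in that case the ray is scaled so its z component equals the depth rather than normalized to unit length:

#include <image_geometry/pinhole_camera_model.h>
#include <opencv2/core/core.hpp>

// model has already been filled via fromCameraInfo(); d is the depth at (u,v) in meters.
cv::Point3d pixelTo3d(const image_geometry::PinholeCameraModel& model,
                      const cv::Point2d& uv, double d)
{
  cv::Point3d ray = model.projectPixelTo3dRay(uv);
  return ray * (d / ray.z);  // scale the ray so that its z component equals d
}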

PCL probably has a way to do this too.


Comments

That should do, even though I'd expect openni_launch or rgbd_launch to offer that as well somewhere. Anyway, which */camera_info do I need then, rgb or depth? And does this depend on whether depth_registered is set?

stfn ( 2014-01-22 09:10:54 -0500 )

camera_info should be from whatever camera the detection came from. If it came from /camera/rgb/whatever, you should use /camera/rgb/camera_info. It's probably simplest to have depth registration on and do everything in /camera/depth_registered, but it shouldn't matter.

Dan Lazewatsky ( 2014-01-22 09:24:37 -0500 )

The solution you propose looks reasonable to me. But I wonder why most of the documentation and discussion on the internet about how to get a 3D point from the depth image is different: they directly treat the depth value as the z coordinate (in mm).

Pablo Iñigo Blasco ( 2016-09-30 02:41:45 -0500 )

It looks like they assume an orthographic projection, while you are proposing a perspective projection, right?

Pablo Iñigo Blasco ( 2016-09-30 05:53:58 -0500 )
answered 2014-01-21 03:14:47 -0500 by Yuan

updated 2014-01-21 03:15:15 -0500

Hi, [xnConvertProjectiveToRealWorld](https://github.com/OpenNI/OpenNI/blob/master/Source/OpenNI/XnOpenNI.cpp) from OpenNI is what you need. PCL may have a wrapper for it.
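A hedged sketch of that call via the OpenNI 1.x C++ wrapper, assuming an already-initialized xn::DepthGenerator named depth_generator, pixel coordinates u, v, and the depth value depth_mm at (u,v) in millimeters:

#include <XnCppWrapper.h>

XnPoint3D projective, real_world;
projective.X = u;          // pixel column
projective.Y = v;          // pixel row
projective.Z = depth_mm;   // depth at (u,v), in millimeters
depth_generator.ConvertProjectiveToRealWorld(1, &projective, &real_world);
// real_world.X, .Y, .Z now hold camera-frame coordinates in millimeters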


answered 2015-08-28 10:53:07 -0500 by Yake

Hello there, I'm using different images (the RGB and the depth), but I can't get z correctly; it is always 1. Also, the x, y I get from the RGB image and from the depth image are not the same.

cv::Point2d pixel_point(98, 296);
cout << "pixel_point (2D) = " << pixel_point << endl << endl;

cv::Point3d xyz = cam_model_.projectPixelTo3dRay(pixel_point);
cout << "point (3D) = " << xyz << endl << endl;

[rgb image]

[depth image]
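As the answer above notes, projectPixelTo3dRay() only returns a viewing ray (here scaled so that z == 1, which is why the printed z is always 1); it still has to be multiplied by the measured depth. A minimal sketch of that missing step, assuming depth_image is the cv::Mat behind the depth topic (16UC1 in millimeters or 32FC1 in meters):

#include <opencv2/core/core.hpp>

// Read the depth at pixel (u,v), handling both common depth image encodings.
double depthAt(const cv::Mat& depth_image, int u, int v)
{
  if (depth_image.type() == CV_16UC1)                       // raw depth in millimeters
    return depth_image.at<unsigned short>(v, u) / 1000.0;   // at(row, col)
  return depth_image.at<float>(v, u);                       // 32FC1 depth in meters
}

// cv::Point3d xyz = cam_model_.projectPixelTo3dRay(pixel_point) * depthAt(depth_image, 98, 296);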



Stats

Asked: 2014-01-21 02:23:56 -0500

Seen: 4,583 times

Last updated: Aug 28 '15