Calculate relative 3D position from points in stereo images?
Hi everybody,
I have a (DIY) stereo camera that is already calibrated, so it can be used with stereo_image_proc,
which successfully and beautifully creates a point cloud. That's nice so far... but it's not the question.
In addition to the image data (RGB), I also have several key points in image coordinates (x, y) that are generated from the images of both the left and the right camera. I would like to match these points so that I can calculate the Z-component for each one. That is, for each corresponding pair I have x1, y1, x2, y2, and I'd like to obtain a full-blown relative (and later on, through tf,
absolute) x, y, z position for each point.
How would I go about this? There is supposedly some equation for this calculation.
Thanks a lot!
//edit: oh, just to add: it is already known which key point in the left image corresponds to which one in the right image. That is, no feature matching is necessary.
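For reference, with a rectified stereo pair the standard pinhole triangulation from a disparity d = x1 - x2 is Z = f·B/d, X = (x1 - cx)·Z/f, Y = (y1 - cy)·Z/f. A minimal sketch of that (the focal length, baseline, and principal point values below are placeholders -- substitute the ones from your own calibration):

```python
# Hypothetical calibration values -- take the real ones from your
# stereo calibration (e.g. the left camera's projection matrix).
f = 700.0               # focal length in pixels (rectified images)
B = 0.12                # baseline in meters
cx, cy = 320.0, 240.0   # principal point of the left camera

def triangulate(x1, y1, x2, y2):
    """Triangulate one rectified stereo correspondence.

    Assumes rectified images (so y1 ~= y2) and a positive
    disparity d = x1 - x2. Returns (X, Y, Z) in the left
    camera's optical frame, in meters.
    """
    d = x1 - x2
    if d <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    Z = f * B / d
    X = (x1 - cx) * Z / f
    Y = (y1 - cy) * Z / f
    return X, Y, Z
```

For example, a correspondence with x1 = 400, x2 = 330 (disparity 70 px) would land at Z = 700 * 0.12 / 70 = 1.2 m. Since the result is in the left camera's optical frame, a tf transform from that frame then gives the absolute position.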
I'm slightly confused: you mention you already have point clouds, which implies that dense stereo matching/processing is already working and disparity can be (successfully?) calculated. Are you now asking to do the same thing again, but for some specific features in both images?