
Split the trajectory into line segments, compute the 3D point-to-segment distance against every segment, and take the smallest one for each point of interest; optimize it with something smarter (a spatial index, say) later if there are a lot of segments. Store the points in a 3D matrix (OpenCV, possibly Eigen, or a GPU 3D texture format) if it is acceptable to have them on a grid and you want the field to be dense.
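The brute-force version of that first step might look like this (a plain-Python sketch; the waypoint list and function names are mine, not from any library):

```python
import math

def point_segment_dist(p, a, b):
    """3D distance from point p to the segment from a to b."""
    ab = [b[i] - a[i] for i in range(3)]
    ap = [p[i] - a[i] for i in range(3)]
    ab2 = sum(c * c for c in ab)
    if ab2 == 0.0:
        t = 0.0  # degenerate segment: a == b
    else:
        # Project p onto the line, then clamp so the closest
        # point stays on the segment rather than the infinite line.
        t = max(0.0, min(1.0, sum(ap[i] * ab[i] for i in range(3)) / ab2))
    closest = [a[i] + t * ab[i] for i in range(3)]
    return math.dist(p, closest)

def dist_to_trajectory(p, waypoints):
    """Smallest distance from p to any segment between consecutive
    waypoints. Brute force; fine for a few thousand segments."""
    return min(point_segment_dist(p, waypoints[i], waypoints[i + 1])
               for i in range(len(waypoints) - 1))
```

The clamp is what distinguishes segment distance from infinite-line distance; without it, points past either endpoint get distances that are too small.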

Or store the values in a PCL point cloud if you want them to be located anywhere at all; PCL already has sqrPointToLineDistance (though that is distance to an infinite line, so you would still clamp to the segment endpoints yourself). As for vector fields, a point type like pcl::PointNormal carries an xyz location plus an xyz vector, which may be enough; otherwise it would take two parallel clouds.

It seems like you could handle thousands of query points per second in real time with the brute-force approach; don't bother precomputing dense matrix representations if that is fast enough.

Another thought: OpenCV can do this for 2D mats. Take a white image, draw the trajectory onto it in black (zero) pixels, and run distanceTransform on it; the result gives every pixel its distance to the nearest zero pixel. It only accepts 2D single-channel images, though, so it probably would not extend to 3D mats directly.
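To make concrete what that transform produces, here is a brute-force reference sketch in plain Python (not how OpenCV implements it, and the grid/cell representation here is my own; cv::distanceTransform does the same job far faster):

```python
import math

def distance_field(grid_shape, trajectory_cells):
    """Brute-force 2D distance field: for every cell of an (h, w) grid,
    the Euclidean distance to the nearest (row, col) trajectory cell.
    This is the field cv::distanceTransform computes, where the
    trajectory would be the zero pixels of the input image."""
    h, w = grid_shape
    traj = list(trajectory_cells)
    return [[min(math.hypot(r - tr, c - tc) for tr, tc in traj)
             for c in range(w)]
            for r in range(h)]
```

Brute force here is O(cells × trajectory cells), which is exactly why the real distance transform (a two-pass sweep) is worth using once the grid gets large.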