Orientation (yaw) estimation using external cameras
Hi,
I am working on a project that uses static, external (not on the drone) stereo cameras to obtain the 3D coordinates of a drone. I am currently trying to obtain a yaw-only orientation estimate of the drone using a pair of external, manually calibrated stereo cameras.
My Question:
Are there any existing ways to do this in ROS? I have seen this tutorial from OpenCV (but haven't yet tried it on the drone), but wanted to know if anyone has used any existing ROS packages? Thanks!
Asked by malharjajoo on 2018-04-13 15:41:29 UTC
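One common approach (not from the original question, just a sketch of the idea): if the stereo pair can triangulate two distinguishable points on the drone, e.g. a hypothetical nose marker and tail marker, yaw in the cameras' world frame falls out of the vector between them:

```python
import numpy as np

def yaw_from_markers(front_pt, rear_pt):
    """Estimate yaw (rotation about the world Z axis) from two 3D points
    triangulated by the external stereo pair: one marker on the drone's
    nose and one on its tail (hypothetical setup, Z-up world frame)."""
    d = np.asarray(front_pt, dtype=float) - np.asarray(rear_pt, dtype=float)
    return np.arctan2(d[1], d[0])  # radians, measured in the camera rig's frame

# Example: nose at (1, 1, 2), tail at (0, 0, 2) -> yaw = pi/4 (45 degrees)
```

This only needs the 3D output the stereo setup already provides; the hard part in practice is reliably telling the two points apart, which is where fiducial markers help.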
Comments
Wouldn't this be a task for one of the various visual odometry packages?
Asked by gvdhoorn on 2018-04-14 00:51:17 UTC
Doesn't visual odometry mean estimating robot motion from camera motion? The stereo cameras here are static and just monitor the robot. Will visual odometry packages like libviso2 still work?
Asked by malharjajoo on 2018-04-14 02:54:32 UTC
Yes, you're right. It's typically used to calculate motion of the camera relative to features observed. I missed the part where your cameras are observing your robot.
Asked by gvdhoorn on 2018-04-14 03:23:22 UTC
Why did you delete your question?
Asked by gvdhoorn on 2018-04-14 03:23:29 UTC
Oh, sorry ... it may have happened by accident. I also edited the question slightly to explain that I am using static, external cameras (it was my fault for not mentioning that clearly).
Asked by malharjajoo on 2018-04-14 05:23:27 UTC
How big is your flying area? The easiest way would be to add some markers (e.g. aruco, artoolkit, ...)
Asked by NEngelhard on 2018-04-14 07:38:36 UTC
Yes, I will be adding them, but I am trying to use the stereo cameras as the main sensor for navigation. My flying area may be 5x5 m or larger, depending on how well the cameras perform.
Asked by malharjajoo on 2018-04-14 08:39:31 UTC
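If you go the marker route NEngelhard suggests, a detector such as `cv2.aruco` returns a rotation vector (rvec) per marker (via `cv2.aruco.estimatePoseSingleMarkers`), and yaw can be read off the corresponding rotation matrix. A minimal numpy-only sketch of that last step, assuming a Z-up frame and ZYX Euler convention (the `rodrigues` helper mirrors what `cv2.Rodrigues` computes):

```python
import numpy as np

def rodrigues(rvec):
    """Rotation vector -> 3x3 rotation matrix (same convention as cv2.Rodrigues)."""
    rvec = np.asarray(rvec, dtype=float)
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta  # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    # Rodrigues' formula: R = I + sin(theta) K + (1 - cos(theta)) K^2
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def yaw_from_rvec(rvec):
    """Yaw (rotation about Z) from a marker pose rvec, e.g. one returned
    by cv2.aruco.estimatePoseSingleMarkers."""
    R = rodrigues(rvec)
    return np.arctan2(R[1, 0], R[0, 0])
```

Note the yaw you get this way is expressed in the camera's frame; to use it for navigation you would still transform it through your manually calibrated extrinsics into the world frame.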