Orientation (yaw) estimation using external cameras
Hi,
I am working on a project that involves using static, external (not on the drone) stereo cameras for obtaining the 3D coordinates of a drone. I am currently trying to obtain a yaw-only orientation estimate of the drone using a pair of external, manually calibrated stereo cameras.
My Question:
Are there any existing ways to do this in ROS? I have seen this tutorial from OpenCV (but haven't tried it on the drone yet), but I wanted to know if anyone has used any existing ROS packages. Thanks!
Wouldn't this be a task for one of the various visual odometry packages?
Doesn't visual odometry mean estimating robot motion from camera motion? The stereo cameras here are static and just observe the robot. Will visual odometry packages like libviso2 still work?
Yes, you're right. It's typically used to calculate motion of the camera relative to features observed. I missed the part where your cameras are observing your robot.
Why did you delete your question?
Oh, sorry, that may have happened by accident. I also edited the question slightly to explain that I am using static, external cameras (it was my fault for not mentioning that clearly).
How big is your flying area? The easiest way would be to add some markers (e.g. aruco, artoolkit, ...).
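If you go the marker route, pose estimators like aruco typically hand you a full 3D rotation of the marker; a minimal sketch of pulling just the yaw angle out of a 3x3 rotation matrix could look like this (assuming a ZYX yaw-pitch-roll Euler convention with Z pointing up; the function name is my own):

```python
import math
import numpy as np

def yaw_from_rotation(R):
    """Extract yaw (rotation about Z, in radians) from a 3x3 rotation
    matrix, assuming the ZYX (yaw-pitch-roll) Euler convention."""
    return math.atan2(R[1, 0], R[0, 0])

# Sanity check: a pure 90-degree rotation about Z
theta = math.pi / 2
Rz = np.array([[math.cos(theta), -math.sin(theta), 0.0],
               [math.sin(theta),  math.cos(theta), 0.0],
               [0.0,              0.0,             1.0]])
print(yaw_from_rotation(Rz))  # ~1.5708 (pi/2)
```

You would feed in the rotation you get from the marker detector (e.g. after converting its rotation vector with `cv2.Rodrigues`), expressed in your world frame.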
Yes, I will be adding them, but I am trying to use the stereo cameras as the main sensor for navigation. My flying area may be 5x5 m or larger, depending on how well the cameras perform.
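Since the stereo pair already gives you 3D coordinates, one simple option is to triangulate two distinguishable points on the drone (say one marker at the tail and one at the nose; these names and the function are hypothetical, not from any ROS package) and take yaw as the heading of the tail-to-nose vector projected onto the ground plane:

```python
import math

def yaw_from_markers(tail, nose):
    """Yaw (radians) of the tail->nose vector, projected onto the XY plane.

    tail, nose: (x, y, z) points triangulated by the static stereo pair,
    expressed in the same world frame. Assumes Z is 'up', so yaw is the
    rotation about Z and only the XY components matter.
    """
    dx = nose[0] - tail[0]
    dy = nose[1] - tail[1]
    return math.atan2(dy, dx)

# Drone pointing along +X -> yaw 0; along +Y -> yaw pi/2
print(yaw_from_markers((0.0, 0.0, 1.0), (1.0, 0.0, 1.0)))  # 0.0
print(yaw_from_markers((0.0, 0.0, 1.0), (0.0, 1.0, 1.0)))  # ~1.5708
```

This sidesteps full 6-DOF pose estimation entirely, at the cost of needing two reliably tracked points instead of one; accuracy will depend on your triangulation error relative to the marker baseline on the drone.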