Help with streaming AR.Drone camera images to openCV
Hello all.
I am using the AR.Drone 2.0 and have set up the ardrone_autonomy driver, and I also installed ROS (Fuerte) from the official website. I am running Ubuntu 12.04.
What I would like to do is write a program that takes the camera stream from the AR.Drone 2.0 and sends it to OpenCV. OpenCV is installed with my ROS distribution inside the vision_opencv folder (it has cv_bridge and opencv2, which is what I would like to use).
I have played around with opencv2 on single images from the OpenCV website, so OpenCV itself seems to work fine. I am a bit confused about how to solve the problem above because I am new to both ROS and OpenCV. I have followed the tutorials, but as far as I can tell they do not cover this problem.
I think there are two parts to what I want to do: 1) get the camera images from the AR.Drone into ROS, and 2) convert the images from the ROS format to OpenCV (and back again). I read that cv_bridge does this, but I'm slightly overwhelmed by it.
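On part 2: a ROS image message is basically a flat byte buffer plus metadata (height, width, encoding, row step), and cv_bridge's job is to reshape that buffer into an OpenCV matrix. The toy sketch below (my own illustration, not cv_bridge code; the 2x2 image and all names are made up) shows the reshaping and RGB-to-BGR channel swap involved in producing OpenCV's "bgr8" layout:

```python
# A sensor_msgs/Image carries a flat byte buffer plus height, width,
# encoding, and step (bytes per row). This toy function mimics, for a
# hypothetical 2x2 "rgb8" image, the reshape + channel swap that
# cv_bridge performs when handing a "bgr8" image to OpenCV.

def imgmsg_to_pixels(data, height, width, step):
    """Turn a flat rgb8 byte buffer into rows of (b, g, r) tuples."""
    rows = []
    for y in range(height):
        row = []
        for x in range(width):
            i = y * step + x * 3           # 3 bytes per rgb8 pixel
            r, g, b = data[i], data[i + 1], data[i + 2]
            row.append((b, g, r))          # OpenCV stores pixels as BGR
        rows.append(row)
    return rows

# Fake 2x2 frame, row-major rgb8: red, green / blue, white
data = bytes([255, 0, 0,   0, 255, 0,
              0, 0, 255,   255, 255, 255])
pixels = imgmsg_to_pixels(data, height=2, width=2, step=6)
print(pixels[0][0])   # red pixel in BGR order -> (0, 0, 255)
```

In practice you never write this loop yourself; you call cv_bridge with the desired encoding ("bgr8") and it does the equivalent conversion for you.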
Ideally, I want a GUI showing the images as they go into OpenCV, but how will I know whether the images are actually reaching OpenCV? OpenCV is just a library and doesn't have an interface like MATLAB, so I also want to understand what kind of output I should expect so I can tell whether the program is working correctly. Any help would be great!
How are the images currently streamed? Can it be used as a sort of IP cam? OpenCV has support for IP cams.
Currently, the images are streamed with Qt using the code provided by the ardrone_tutorials. The drone sends the images over an ad-hoc wifi network that it creates. What does IP stand for?
Do you want to get the stream from the AR.Drone in order to process it with OpenCV?
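If the goal is just to get the ardrone_autonomy image stream into OpenCV, the usual pattern is a small ROS node that subscribes to the driver's image topic and converts each message with cv_bridge. Here is the structure in pseudocode rather than tested code, since the exact cv_bridge Python call differs between ROS releases (older releases use `imgmsg_to_cv`, newer ones add `imgmsg_to_cv2`), and you should confirm the topic name on your setup with `rostopic list` (ardrone_autonomy publishes something like `/ardrone/image_raw`):

```
# pseudocode outline of a ROS image-subscriber node
initialise a node, e.g. "ardrone_opencv_viewer"
bridge = new CvBridge

on each message msg from the drone's image topic (type sensor_msgs/Image):
    cv_image = bridge.convert(msg, desired_encoding = "bgr8")
    show cv_image in an OpenCV window
    # seeing the live video in this window is your confirmation that
    # the images reached OpenCV; printing the image size or saving a
    # frame to disk also works as a sanity check

spin (block and wait for callbacks)
```

That displayed window answers the "how do I know it's working" question: OpenCV has no MATLAB-style interface, but it can open plain image windows, and once a frame is in an OpenCV image you can run any OpenCV function on it.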