Following another uncontrolled drone using the tf library?
Hi,
I need some advice on how to proceed with the following -
Scenario: I have a Parrot AR.Drone that I'm controlling using ROS. I want to follow another drone (most likely another Parrot AR.Drone). Assume that I can detect the other (threat) drone, but that it is not under my control, so I cannot make it publish its position as suggested in the tf tutorials.
Update: Since it is difficult to fly two drones around physically, I am currently simulating both in Gazebo with the tum_simulator package. So far I have set up a Gazebo world with two drones and one stereo camera (the white cube in the image).
My question:
- Would the tf library help in any way here? If not, then as a preliminary step (not the final scenario described above) I could control the other drone, have it publish its navigation information, and then use the tf library to follow it.
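For the preliminary case, what tf's `lookupTransform` would hand you is the target's pose expressed in the follower's frame. A minimal ROS-free sketch of that computation in 2D (poses as `(x, y, yaw)` tuples; frame names and the 2D simplification are my assumptions, not part of the original setup):

```python
import math

def invert(pose):
    # Invert a 2D pose (x, y, yaw) treated as a rigid transform.
    x, y, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    return (-(c * x + s * y), s * x - c * y, -yaw)

def compose(a, b):
    # Chain two transforms: apply a, then b expressed in a's frame.
    ax, ay, ayaw = a
    bx, by, byaw = b
    c, s = math.cos(ayaw), math.sin(ayaw)
    return (ax + c * bx - s * by, ay + s * bx + c * by, ayaw + byaw)

def relative_pose(follower_in_world, target_in_world):
    # Equivalent of tf's lookupTransform(follower_frame, target_frame):
    # the target's pose seen from the follower.
    return compose(invert(follower_in_world), target_in_world)

# Follower at the origin facing +y (yaw = pi/2), target 2 m along +y:
# the target appears 2 m straight ahead (+x) in the follower's frame.
rel = relative_pose((0.0, 0.0, math.pi / 2), (0.0, 2.0, math.pi / 2))
```

With tf proper, you would broadcast the controlled drone's odometry as a frame and let the listener do this bookkeeping (including time interpolation) for you, exactly as in the tf turtle-follower tutorial.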
- Does anyone have a link to a quick tutorial on obtaining point cloud information from the stereo camera (the cube in the image)? I would like to use rviz to visualize the point cloud of the environment.
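In ROS the usual route is the `stereo_image_proc` package, which turns a rectified stereo pair into a disparity image and a point cloud that rviz can display directly. The underlying math per pixel is plain triangulation; a sketch, assuming a rectified pinhole pair with focal lengths `fx`/`fy`, principal point `(cx, cy)`, and baseline `b` (all parameter values below are made-up examples):

```python
def disparity_to_point(u, v, disparity, fx, fy, cx, cy, baseline):
    # Back-project one pixel with known disparity into a 3D point
    # in the left camera's frame (rectified pair, disparity in pixels).
    z = fx * baseline / disparity      # depth from stereo geometry
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# Example: a pixel at the principal point with 10 px disparity,
# fx = 500 px and a 0.2 m baseline, lies 10 m straight ahead.
point = disparity_to_point(320, 240, 10.0, 500.0, 500.0, 320.0, 240.0, 0.2)
```

Applied over the whole disparity image, this produces the point cloud that `stereo_image_proc` publishes for rviz.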
If this simulation is successful, I will use an actual pair of stereo cameras (triangulation, pose estimation in OpenCV, etc.) to estimate the second drone's position and then guide the first drone with some form of closed-loop control.
Would this strategy work? I welcome any suggestions you may have.
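The closed-loop part could start as something as simple as a proportional controller on the target's position in the follower's body frame, published as a velocity command (the gain, standoff distance, and limits here are illustrative placeholders, not tuned values):

```python
def follow_cmd(rel_x, rel_y, rel_z, standoff=1.0, kp=0.5, v_max=0.5):
    # P-controller: drive each body-frame velocity toward the target,
    # holding `standoff` metres back along the forward (x) axis.
    def clamp(v):
        return max(-v_max, min(v_max, v))
    vx = clamp(kp * (rel_x - standoff))  # stop short of the target
    vy = clamp(kp * rel_y)
    vz = clamp(kp * rel_z)
    return vx, vy, vz

# Target 3 m ahead: command saturates at v_max forward.
cmd = follow_cmd(3.0, 0.0, 0.0)
```

On the AR.Drone these three values would map onto the linear fields of a `geometry_msgs/Twist` on `cmd_vel`; noisy stereo estimates would likely push this toward a PID or a Kalman-filtered input, but a P-controller is enough to validate the pipeline in Gazebo.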