
Following another uncontrolled drone using the tf library?

asked 2018-01-10 15:59:12 -0600

malharjajoo

updated 2018-02-03 19:05:33 -0600

Hi,

I need some advice on how to proceed with the following -

Scenario: I have a Parrot AR.Drone that I'm controlling using ROS. I want to follow another drone (most likely another Parrot AR.Drone). Assume that I can detect the other (threat) drone, but that it is not under my control (so I cannot make it publish its position, as suggested in the tf tutorials).

Update: Since it is difficult to fly two drones around physically, I am currently simulating both in Gazebo with the tum_simulator package. So far I have set up a Gazebo world with two drones and one stereo camera (the white cube in the image).

[Screenshot: Gazebo world with the two drones and the stereo camera cube]

My question:

  1. Would the tf library help in any way here? If not, then in a preliminary scenario (not the final scenario described above), I can control the other drone, have it publish its navigation information, and use the tf library to follow it.
  2. Does anyone have a link to a quick tutorial on obtaining point cloud information from the stereo camera (the cube in the image)? I want to use rviz to visualize the point cloud of the environment.
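(For question 2, the usual route in ROS is the stereo_image_proc package, which publishes a point cloud topic that rviz can display. The geometry underneath is plain triangulation; here is a minimal sketch of that math, where the focal length, baseline, and principal point are made-up example values, not a real calibration:)

```python
# Pinhole stereo triangulation: recover a 3-D point from a pixel match in a
# rectified stereo pair. f, baseline, cx, cy are illustrative placeholder
# values, NOT a real camera calibration.

def triangulate(u, v, disparity, f=700.0, baseline=0.2, cx=320.0, cy=240.0):
    """Return (X, Y, Z) in the left camera frame, in metres."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    Z = f * baseline / disparity   # depth falls off as 1/disparity
    X = (u - cx) * Z / f           # back-project the pixel offsets
    Y = (v - cy) * Z / f
    return X, Y, Z

# A 35-pixel disparity with these numbers puts the point ~4 m ahead.
print(triangulate(u=400.0, v=240.0, disparity=35.0))
```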

If this simulation is successful, I will use an actual pair of stereo cameras (triangulation, pose estimation in OpenCV, etc.) to estimate the second drone's navigation information and then use that to guide the first drone with some form of closed-loop control.
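(A plain proportional controller on the position error is a common first cut at the closed-loop part. A minimal sketch, where the gain, standoff distance, and speed limit are arbitrary placeholder values, and the output would become a geometry_msgs/Twist on cmd_vel in ROS:)

```python
# Proportional "follow" controller: command a velocity toward the target,
# stopping at a standoff distance. All gains/limits are placeholder values.
import math

def follow_cmd(chaser, target, k_p=0.8, standoff=1.0, v_max=1.0):
    """chaser/target are (x, y, z) tuples; returns a (vx, vy, vz) command."""
    err = [t - c for c, t in zip(chaser, target)]
    dist = math.sqrt(sum(e * e for e in err))
    if dist <= standoff:
        return (0.0, 0.0, 0.0)              # close enough: hover
    gain = k_p * (dist - standoff) / dist   # scale the error direction
    vx, vy, vz = (gain * e for e in err)
    speed = math.sqrt(vx * vx + vy * vy + vz * vz)
    if speed > v_max:                       # saturate to a sane top speed
        vx, vy, vz = (v * v_max / speed for v in (vx, vy, vz))
    return (vx, vy, vz)

# Target 3 m ahead along x: full-speed command straight toward it.
print(follow_cmd((0.0, 0.0, 1.0), (3.0, 0.0, 1.0)))
```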

Would this strategy work? I welcome any suggestions you may have.


1 Answer


answered 2018-01-11 03:27:48 -0600

  1. The tf library would be one of the underlying ROS systems you use to share location information between nodes, but it will not help you with the perception side of things.
  2. If you can get the target drone to broadcast its GPS location to ROS using the tf system, that gives you a good starting point for developing your perception and path-planning systems. It will be your 'ground truth' pose to compare experimental pose estimates against, to see how well the chaser drone is doing.
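(To expand on point 2: what tf buys you is automatic composition of frame transforms, e.g. world -> chaser and world -> target yield chaser -> target. The core of that lookup can be shown in 2-D with plain arithmetic; this is a hand-rolled illustration of the math, not the tf API:)

```python
# The essence of a tf lookup: express the target's world position in the
# chaser's body frame, given the chaser's world pose (2-D for brevity).
import math

def to_body_frame(chaser_pose, target_xy):
    """chaser_pose = (x, y, heading_rad) in world; target_xy = (x, y) in world."""
    cx, cy, th = chaser_pose
    dx, dy = target_xy[0] - cx, target_xy[1] - cy
    c, s = math.cos(th), math.sin(th)
    # Rotate the world-frame offset by -heading into the body frame.
    return (c * dx + s * dy, -s * dx + c * dy)

# Chaser at the origin facing +y (90 deg); a target 2 m "north" in world
# coordinates is dead ahead in the body frame.
print(to_body_frame((0.0, 0.0, math.pi / 2), (0.0, 2.0)))
```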

I'd like to add that your project sounds challenging. Are your drones expected to fly outdoors? At what range are you hoping your chaser drone will be able to detect and track the target drone?

The best sensor for the job would be a high-end LIDAR system, but that will set you back a very large amount of money. Stereopsis is an option, but there will always be a range beyond which depth cannot be determined, so your chaser will have to switch between knowing where the target is and only knowing which direction it is in. Another challenge is that drones can move very fast; this could cause problems for stereopsis because of motion blur and the latency of the 3D estimate.
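(The range limit above is quantifiable: for depth Z = f*B/d, a fixed disparity error of ±Δd pixels produces a depth error that grows roughly as Z². A quick sketch, with placeholder camera numbers:)

```python
# Why stereo depth degrades with range: differentiating Z = f*B/d gives
# |dZ| ~ (Z^2 / (f*B)) * delta_d. The focal length f (pixels) and baseline
# B (metres) are placeholder values, not a real rig.

def depth_uncertainty(Z, f=700.0, baseline=0.2, delta_d=0.5):
    """Approximate depth error (m) at range Z for a half-pixel disparity error."""
    return (Z * Z / (f * baseline)) * delta_d

for Z in (2.0, 5.0, 10.0, 20.0):
    # Error grows quadratically: doubling the range quadruples the error.
    print(Z, round(depth_uncertainty(Z), 3))
```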

It sounds like a super interesting project though, so I wish you all the best with it. Let us know if you have any more questions.


Comments

Hi, thanks for your answer. I will get back to this in 2-3 days. Currently the project is planned for an indoor environment (hence the use of stereo cameras for triangulation), and the drones will be detected using background-subtraction methods (which assumes the background is static).
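(For reference, the core of the background-subtraction idea is just thresholding the per-pixel difference against a static reference frame; in practice OpenCV's adaptive subtractors such as BackgroundSubtractorMOG2 are the usual tool. A toy sketch on grayscale values:)

```python
# Toy background subtraction: mark pixels that differ from a static
# reference frame by more than a threshold. Real pipelines (e.g. OpenCV's
# BackgroundSubtractorMOG2) maintain an adaptive per-pixel model instead.

def subtract(background, frame, threshold=25):
    """background/frame: 2-D lists of grayscale values; returns a binary mask."""
    return [[1 if abs(f - b) > threshold else 0
             for b, f in zip(brow, frow)]
            for brow, frow in zip(background, frame)]

bg    = [[10, 10, 10],
         [10, 10, 10]]
frame = [[10, 200, 10],   # a bright "drone" blob appears in one column
         [10, 210, 12]]
print(subtract(bg, frame))  # -> [[0, 1, 0], [0, 1, 0]]
```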

malharjajoo (2018-01-11 04:45:28 -0600)

@PeteBlackerThe3rd - Have you used LIDAR? I have no experience with it, but I'm interested in knowing whether it would suit my purpose of getting a point cloud representation of the environment (can you check the updated question as well)?

malharjajoo (2018-02-03 19:10:16 -0600)
