Robotics StackExchange | Archived questions

How to find a flat landing spot for UAV?

Hello ROS enthusiasts, for my project I am trying to autonomously control a PX4 UAV over mountainous terrain. I have written code with a bunch of different waypoints which enables the UAV to travel over this region.

Now my next task is to find a flat landing spot for the UAV over these mountain ranges using suitable onboard sensors. After a lot of thinking and searching I am coming up short every single time, hence I need your help. My first thought was to use the rtabmap_ros package to create point clouds and get a 3D map of the given terrain. But what next?

Any idea how I can implement this in a Gazebo simulation?

Thanks.

Asked by Yash on 2019-06-02 01:12:16 UTC

Comments

(Commenting mostly to follow the discussion.) Wow, that's a really interesting problem. I'm not going to be able to give you a good "here's how to do it" answer, but I can ask some questions and hopefully help you figure out some options.

What sensors do you have access to? Do you have any prior knowledge of the areas you're flying in? Do you have any specific landing areas in mind that you're targeting, where you only need to check specific spots within them to land?

Asked by stevemacenski on 2019-06-02 01:24:33 UTC

Hey thanks for joining the discussion.

  1. It is mostly simulation, hence I can use any sensor I want (provided it is compatible with PX4 and Gazebo). I was thinking of using a Velodyne sensor.

  2. I sort of had a Collada file of the images of the region, which I imported into Gazebo by writing an SDF script. So, to answer the question, no and yes. It is kind of tricky to have knowledge of the entire area, but I was hoping that the UAV would do this task for me by creating a map out of the 3D world from Gazebo (this mostly includes mountainous terrain, gradients, etc.).

  3. I do not really know how to start looking for the landing area because I do not know how to find a flat surface in that region.

Asked by Yash on 2019-06-02 02:08:16 UTC

I suppose I can describe one way you could approach this from a very high level. I don't know off the top of my head the best tools for each step, but I know they all exist out there:

If you have a 3D lidar on a drone, you can pretty "easily" (meaning largely solved to the degree you need) create a 3D map of your environment made of points, using something like LOAM or some other 3D SLAM if you don't have really accurate GPS/INS.

If you have a 3D map of points, you can determine the normals of the ground regions in the area you're generally wanting to land in.

With a set of normals, you should be able to find a clustered region of normals large enough for you to land on successfully, both in size and in having a ground gradient sufficient for your purposes (flat).
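For concreteness, here is a minimal Python sketch of that normals-then-clustering idea; it is just one way to do it, assuming Open3D, a cloud already expressed in a z-up map frame, and placeholder radii/thresholds you would need to tune for your terrain:

```python
# Sketch: pick out candidate flat patches from a terrain point cloud.
# Assumes a z-up map frame; all radii and thresholds are placeholders.
import numpy as np
import open3d as o3d

def find_flat_patches(xyz, max_slope_deg=10.0, landing_radius=2.0):
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(xyz)

    # Per-point surface normals from local neighbourhoods
    pcd.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=1.0, max_nn=30))
    normals = np.asarray(pcd.normals)

    # Keep points whose normal is close to vertical (locally flat ground)
    flat_mask = np.abs(normals[:, 2]) > np.cos(np.radians(max_slope_deg))
    flat_pcd = pcd.select_by_index(np.where(flat_mask)[0])

    # Group the flat points into spatially connected patches
    labels = np.asarray(flat_pcd.cluster_dbscan(eps=0.5, min_points=20))
    if labels.size == 0:
        return []

    patches = []
    pts = np.asarray(flat_pcd.points)
    for lbl in range(labels.max() + 1):
        patch = pts[labels == lbl]
        extent = patch[:, :2].max(axis=0) - patch[:, :2].min(axis=0)
        if extent.min() >= 2 * landing_radius:   # big enough to land on
            patches.append(patch)
    return patches
```

From the surviving patches you could then pick whichever one is closest to your final waypoint.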

How to do that from images in Gazebo? That sounds like you may need to create a model.

Asked by stevemacenski on 2019-06-02 02:16:30 UTC

That's not going to cover things like making sure you don't land in a tree or a river, but it's a start.

Asked by stevemacenski on 2019-06-02 02:19:30 UTC

I have created a model out of the images. Also, this might sound a bit dumb, but what exactly do you mean by 'determine normals of the ground regions'? And can I use LOAM with ROS Kinetic? Last I checked, I thought it was used with Indigo. Please correct me if I am wrong.

Asked by Yash on 2019-06-02 02:23:46 UTC

If you have a set of points from a point cloud in some map, you can use those 3D points and their neighbors to determine the normal vector at a point and over a region. Then you could conceivably cluster sections of normals that you could land on until you found one large enough to work.
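If it helps to see the mechanics, here is a small NumPy/SciPy sketch of what "determine the normal from a point and its neighbors" boils down to: PCA on the k nearest neighbors, taking the normal as the eigenvector with the smallest eigenvalue (k and the orientation convention are assumptions):

```python
# Per-point normals by PCA over the k nearest neighbours of each point.
import numpy as np
from scipy.spatial import cKDTree

def point_normals(points, k=20):
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)            # k nearest neighbours per point
    normals = np.empty_like(points)
    for i, nbrs in enumerate(idx):
        cov = np.cov(points[nbrs].T)            # 3x3 covariance of the local patch
        eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
        n = eigvecs[:, 0]                       # smallest eigenvalue -> surface normal
        normals[i] = n if n[2] >= 0 else -n     # orient roughly "up"
    return normals
```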

Indigo and Kinetic really weren't that far apart. My guess is it'll compile out of the box, and if not, with very little effort. I just used that as one example of a 3D SLAM implementation. If you're working in simulation and you just "assume" perfect knowledge for this demonstration, you could just buffer these points into a grid/raw data as your "map".

Asked by stevemacenski on 2019-06-02 02:27:06 UTC

Okay, I will try to do that. If I have some doubts I will continue this discussion here. Thank you so much!

Asked by Yash on 2019-06-02 03:36:01 UTC

Answers

As has been mentioned, you will at some point be generating a point-cloud map of the terrain using your sensors and some type of front-end processing. This could be done using RTAB-Map and a good RGB-D sensor, or a scanning lidar and a motion estimation system. There are a few options here; what I'm going to describe is a pipeline to go from a point cloud to a set of safe landing spots.

Firstly, point clouds are great, but they're very computationally expensive to work with. Given that you're working with outdoor terrain, I recommend converting the point cloud to a digital elevation map (DEM) first. You need to choose a suitable cell size based on the size of the flat landing area you need; I'd guess a tenth of the flat radius you need would be a good start. You'll then generate a DEM which contains the minimum and maximum elevations of each cell; this is important because it allows you to distinguish between flat areas and foliage, etc.
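As a rough illustration (not any particular library's API), binning the cloud into a min/max DEM could look something like this, where cell_size is the tunable parameter just described:

```python
# Build a min/max digital elevation map (DEM) from an N x 3 point array.
# cell_size is a placeholder -- roughly a tenth of the flat radius you need.
import numpy as np

def pointcloud_to_dem(points, cell_size=0.5):
    xy, z = points[:, :2], points[:, 2]
    origin = xy.min(axis=0)
    idx = np.floor((xy - origin) / cell_size).astype(int)
    nx, ny = idx.max(axis=0) + 1

    z_min = np.full((nx, ny), np.inf)      # inf marks unobserved cells
    z_max = np.full((nx, ny), -np.inf)
    np.minimum.at(z_min, (idx[:, 0], idx[:, 1]), z)
    np.maximum.at(z_max, (idx[:, 0], idx[:, 1]), z)
    return origin, z_min, z_max
```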

When you have the DEM calculated, it is relatively simple to search across the 2D data and find areas large enough that are also smooth and horizontal enough to land on.
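For example (the window size and thresholds below are assumptions you would tune to your airframe), you could slide a window the size of the landing footprint over the DEM, rejecting windows with a large per-cell min/max spread (foliage, clutter) or a large overall height range (slope):

```python
# Search the DEM for candidate landing cells.
import numpy as np

def find_landing_cells(z_min, z_max, window=9,
                       max_roughness=0.2, max_height_range=0.5):
    nx, ny = z_min.shape
    half = window // 2
    candidates = []
    for i in range(half, nx - half):
        for j in range(half, ny - half):
            wmin = z_min[i - half:i + half + 1, j - half:j + half + 1]
            wmax = z_max[i - half:i + half + 1, j - half:j + half + 1]
            if np.isinf(wmin).any():                 # unobserved cells in window
                continue
            roughness = (wmax - wmin).max()          # per-cell spread: rejects foliage
            height_range = wmin.max() - wmin.min()   # overall slope across the window
            if roughness < max_roughness and height_range < max_height_range:
                candidates.append((i, j))            # cell (i, j) looks landable
    return candidates
```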

There are other potential hazards such as bodies of water, vehicles and roads that may have to be considered depending on your application. You might not want to land on the roof of a car for example, but that's a trickier one to avoid!

Hope this helps.

Asked by PeteBlackerThe3rd on 2019-06-02 06:12:19 UTC

Comments

Okay, I will try to do that. But if I get stuck somewhere I will continue the discussion on this thread.

Asked by Yash on 2019-06-02 06:29:43 UTC

> You might not want to land on the roof of a car for example

That's actually the goal of some challenges involving drones and autonomous landing, MBZIRC being one of them.

Asked by gvdhoorn on 2019-06-02 06:34:04 UTC

Definitely best with permission of the owner!

Asked by PeteBlackerThe3rd on 2019-06-02 07:04:15 UTC