
ARDrone 2.0's coordinate system

asked 2018-01-26 03:41:23 -0500 by Steve_RosUsr

updated 2018-02-13 01:53:35 -0500

Hi community!

I am writing my university thesis on the localization and navigation of the ARDrone 2.0. I have read all of the available topics and documentation on the ROS support for the ARDrone, but I have a problem which I could not solve yet.

I know that I can get the position data from the drone's /ardrone/odometry topic. And I also know that if I receive a msg object on that topic, the position data is in the msg->pose.pose.position field!
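(For reference, this is roughly how I read it in roscpp; the node name and the printout are just mine, nothing prescribed by the driver:)

    #include <ros/ros.h>
    #include <nav_msgs/Odometry.h>

    // Print the position the driver has integrated so far.
    void odomCallback(const nav_msgs::Odometry::ConstPtr& msg)
    {
      const geometry_msgs::Point& p = msg->pose.pose.position;
      ROS_INFO("odom position: x=%.3f y=%.3f z=%.3f", p.x, p.y, p.z);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "ardrone_odom_listener");
      ros::NodeHandle nh;
      ros::Subscriber sub = nh.subscribe("ardrone/odometry", 10, odomCallback);
      ros::spin();
      return 0;
    }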

But can somebody explain to me HOW THIS POSITION DATA IS DETERMINED? How does the drone's coordinate system work? How can I get this coordinate system?

This is important to me because I have an indoor localization system, and with its help I plan the start and end points of the path. But thanks to the turbulence and the noise of the drone, the data provided by the localization system is very noisy during the travel.

So I figured out that I have to transform the indoor localization data (start coordinate, end coordinate, orientation) into the drone's coordinate system. But it is not clear to me how this inner coordinate system works, and I could not find any info about it either... yet.

So I hope you can help me! This is very important, because it is the last step needed to finish my work! My mathematical model is solid and I coded it properly, but the detection of "has the drone reached the destination" isn't working properly.

I'm using Indigo Igloo in an Ubuntu 14.04 virtual machine, and ardrone_autonomy as well.

Thank you in advance! Yours, Steve


Update:

Thank you for answering this fast, lads! I appreciate it!

So is there any way to "visualize" / draw up some kind of coordinate system for my calculations? I would like to do a coordinate transformation on my own in order to get the destination coordinate in the drone's coordinate system.

The aim of this calculation is to supervise the drone's travel to the destination point: if the drone has not turned enough and has started to travel in a wrong direction, I could shut down the process automatically. Or, if the drone has arrived in the destination's neighborhood (for example within 30 cm of that point), I could send a signal to the drone which tells it to STOP, YOU JUST ARRIVED AT YOUR DESTINATION! A sketch of the check I have in mind follows.
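In my head, the arrival check is just a Euclidean distance test once the goal is expressed in the drone's odom frame; something like this (the goal coordinates and the 0.3 m radius are placeholder values I made up):

    #include <cmath>
    #include <nav_msgs/Odometry.h>

    // Goal expressed in the drone's odom frame (placeholder values; in
    // practice it would come from transforming the indoor-localization goal).
    static const double GOAL_X = 2.0, GOAL_Y = 1.0, GOAL_Z = 1.5;
    static const double ARRIVAL_RADIUS = 0.3;  // 30 cm

    bool hasArrived(const nav_msgs::Odometry& odom)
    {
      const double dx = odom.pose.pose.position.x - GOAL_X;
      const double dy = odom.pose.pose.position.y - GOAL_Y;
      const double dz = odom.pose.pose.position.z - GOAL_Z;
      return std::sqrt(dx * dx + dy * dy + dz * dz) < ARRIVAL_RADIUS;
    }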


Update2

I am so sorry that ardrone_autonomy's documentation is so imperfect. I mean, the coordinate frames chapter is only one page long... And there is no concrete documentation of how the drone's frames are related to each other, or how and when the odom tf is set up... Why do I have to take the time to figure out such important things?

Why can't there be just one person, or a team, who writes the documentation RESPONSIBLY? It is very annoying!


Comments


@Steve_RosUsr: it's considered bad form to say something like "this is urgent" or to give deadlines, so I removed that from your question.

jayess (2018-01-26 15:23:28 -0500)

@Steve_RosUsr: I deleted your "answer" and moved the text to your question. Please don't use answers to provide more information. This isn't a forum.

jayess (2018-01-27 04:55:47 -0500)

Many of the projects that are available were made as part of a person's or group's research (such as ardrone_autonomy). The project was then graciously released to the community so that everyone can benefit. We should feel fortunate that we can "stand on others' shoulders" and focus on our ...

jayess (2018-02-13 02:03:54 -0500)

research and not have to write drivers and other software that isn't directly related to our research/job. Sometimes documentation isn't the best (ardrone_autonomy's is quite good actually), but since this is open source you can... read the source!

jayess (2018-02-13 02:05:32 -0500)

There are also plenty of books, articles, and documentation (and this site) that explain a lot of ROS (which is what you're having trouble with). ROS has a very large ecosystem and takes a lot of reading (including source code) to understand how everything works. Good luck and keep asking questions!

jayess (2018-02-13 02:07:44 -0500)

@jayess: we should be glad we can re-use, but a dump of source code with an implicit "but you can read the source" is obviously not very helpful. The ardrone pkgs are not like that, but I can understand the frustration of the OP.

In this case, I believe the authors expect (whether that is ..

gvdhoorn (2018-02-13 02:16:35 -0500)

.. ok or not, I don't know) a certain level of understanding from their users, and they show a TF tree + add a link to REP-103. For someone 'in the know' wrt ROS tf frames (and probably drones), this should be sufficient to understand what is going on.

For newcomers this might not be enough.

gvdhoorn (2018-02-13 02:17:51 -0500)

@gvdhoorn: Agreed. An answer of "read the source" isn't productive, nor is it what I was arguing for. What I was arguing for was instead to remember that many of the packages that we use aren't written by big companies with the time/budget to write extensive documentation. And, that there are many ...

jayess (2018-02-13 02:33:54 -0500)

2 Answers


answered 2018-01-26 15:32:14 -0500 by jayess

updated 2018-01-27 05:01:40 -0500

The coordinate frames for ardrone_autonomy are given in the documentation and conform to the conventions given in REP-103.

As for how the odometry is calculated, again, from the documentation:

Odometry data

New in version 1.4.

The driver calculates and publishes Odometry data by integrating velocity estimates reported by the drone (which is based on optical flow). The data is published as nav_msgs/Odometry messages to ardrone/odometry topic. The corresponding TF transform is also published as odom -> base transformation.

So, basically the odometry is calculated using dead reckoning. It's noisy because that's just the nature of dead reckoning.
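Conceptually the integration looks something like this (a simplified sketch of the idea, not the driver's actual code; covariance and altitude handling are omitted):

    #include <cmath>

    // Dead reckoning: rotate the body-frame velocity estimates into the
    // odom frame and integrate them over the elapsed time. Any error in
    // the velocity estimate accumulates in the position, which is why
    // the odometry drifts.
    struct Pose2D { double x, y, yaw; };

    void integrate(Pose2D& pose, double vx, double vy, double yaw, double dt)
    {
      pose.yaw = yaw;  // the drone reports an absolute yaw estimate
      pose.x += (vx * std::cos(yaw) - vy * std::sin(yaw)) * dt;
      pose.y += (vx * std::sin(yaw) + vy * std::cos(yaw)) * dt;
    }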


Update:

You can add a tf frame and visualize that in RViz. Please refer to the tf tutorials to learn how to do that.
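For example, a broadcaster along these lines (using the tf API available on Indigo; the frame names and transform values are placeholders you'd replace with your own) would add a frame that shows up in RViz and rqt_tf_tree:

    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "indoor_frame_broadcaster");
      ros::NodeHandle nh;
      tf::TransformBroadcaster br;
      ros::Rate rate(50);
      while (nh.ok())
      {
        tf::Transform t;
        // Placeholder: pose of the indoor-localization origin in the odom frame.
        t.setOrigin(tf::Vector3(1.0, 0.5, 0.0));
        t.setRotation(tf::createQuaternionFromYaw(0.0));
        br.sendTransform(tf::StampedTransform(
            t, ros::Time::now(), "odom", "indoor_map"));
        rate.sleep();
      }
      return 0;
    }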


Comments

Thank you! :) Just one more question, because it is difficult for me to understand and I am confused right now: is there already an odom tf frame and a base tf frame for the drone? How can I access them directly?

Steve_RosUsr (2018-01-29 08:23:58 -0500)

If you look at the first link in the answer you'll find the tf tree that the driver creates. You can also examine it by using rqt_tf_tree while the driver is running.

jayess (2018-01-29 12:50:16 -0500)

So somehow I have to create a tf frame for the indoor localization system, and after that this transformation will be easy.

Steve_RosUsr (2018-01-30 02:49:05 -0500)

It depends on your setup. We have a VICON system and use the vrpn_client_ros package which can broadcast transformations for you.

jayess (2018-01-30 03:04:08 -0500)

I use the Marvelmind indoor ultrasonic localization system.

Repository link

ROS integration

Steve_RosUsr (2018-01-30 04:50:17 -0500)

I don't know about that package. Go through the tf tutorials that I linked to and, if that package doesn't provide the transformation for you, see whether you can do it yourself. You can also ask another question about that package and maybe someone will know about it.

jayess (2018-01-30 10:12:42 -0500)

I have just asked them too :) Thanks for the help, lads!

Steve_RosUsr (2018-01-31 02:48:09 -0500)

answered 2018-01-26 07:16:45 -0500 by rdelgadov

Hello Steve,

I think it could be very useful for you to use the tum_vision package, which gives a more accurate state estimate. The ardrone_autonomy documentation says that the odometry is integrated from the velocity, and the velocity is in meters/second, so I think the position should be in meters (I used it last year but I forgot the units :c).

A useful note: the cmd_vel command is not in meters/second; it is a fraction of the maximum inclination angle, in an intuitive reference frame: the roll inclination is in linear.y, the pitch is in linear.x, and the yaw is in angular.z. I hope this information is useful for you.
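For example, a command like this (a rough sketch; the numbers are fractions of the maximum tilt/rate that I picked arbitrarily, not meters/second):

    #include <ros/ros.h>
    #include <geometry_msgs/Twist.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "cmd_vel_example");
      ros::NodeHandle nh;
      ros::Publisher pub = nh.advertise<geometry_msgs::Twist>("cmd_vel", 1);
      ros::Duration(1.0).sleep();  // let the connection establish

      geometry_msgs::Twist cmd;
      cmd.linear.x = 0.2;   // pitch forward at 20% of the maximum tilt angle
      cmd.linear.y = 0.0;   // no roll
      cmd.angular.z = 0.1;  // yaw at 10% of the maximum yaw rate
      pub.publish(cmd);
      ros::spinOnce();
      return 0;
    }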


Comments

I tried this package before; it did not work... And it is not the best solution for me, because there are a lot of nodes that are unnecessary (for me) and need a lot of resources... and I don't need unnecessary image processing algorithms :(

Steve_RosUsr (2018-01-26 07:44:18 -0500)

I understand the problem. In that context, the position is calculated from the velocity given by the IMU sensor multiplied by the time between each update (around 200 Hz, as the tum_vision group says in their papers), so the accuracy is very low. (The position is in meters; I found the information on my computer!)

rdelgadov (2018-01-26 08:00:17 -0500)
