ARDrone 2.0's coordinate system

Hi community!

I am writing my university thesis on localization and navigation of the ARDrone 2.0. I have read all of the available topics and documentation for the ROS support of the ARDrone, but there is one problem I have not been able to solve yet.

I know that I can get position data from the drone's /ardrone/odometry topic, and that when a message arrives on that topic, the position is in the msg->pose.pose.position field.
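As a minimal sketch of reading that field: the helper below works on any object shaped like a nav_msgs/Odometry message (the function names are illustrative, not part of ardrone_autonomy).

```python
# Minimal sketch: pull the position out of an Odometry-style message.
# In a running ROS node this callback would be registered with:
#   import rospy
#   from nav_msgs.msg import Odometry
#   rospy.Subscriber('/ardrone/odometry', Odometry, odom_callback)

def extract_position(msg):
    """Return (x, y, z) from a nav_msgs/Odometry-like message."""
    p = msg.pose.pose.position
    return (p.x, p.y, p.z)

def odom_callback(msg):
    x, y, z = extract_position(msg)
    print("drone position: x=%.2f y=%.2f z=%.2f" % (x, y, z))
```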

But can somebody explain to me how this position data is determined? How does the drone's coordinate system work, and how can I obtain it?

This is important to me because I have an indoor localization system, and with its help I plan the start and end points of the path. But due to the turbulence and noise of the drone, the data provided by the localization system is very noisy while the drone is travelling.

So I figured out that I have to transform the indoor localization data (start coordinate, end coordinate, orientation) into the drone's coordinate system. But it is not clear to me how this internal coordinate system works, and I have not found any information about it yet.
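The transform itself can be sketched in 2D. The odometry frame typically starts at the drone's take-off pose, so if the take-off position and yaw are known in the indoor-localization (world) frame, a world point can be translated and rotated into the odometry frame. All names below are illustrative, and odom_origin_world / yaw_offset are assumed to be measured at take-off:

```python
import math

def world_to_odom(p_world, odom_origin_world, yaw_offset):
    """Transform a 2D world point into the drone's odometry frame.

    Assumes the odometry frame's origin sits at the drone's take-off
    position (odom_origin_world, given in world coordinates) and is
    rotated by yaw_offset radians relative to the world frame.
    """
    dx = p_world[0] - odom_origin_world[0]
    dy = p_world[1] - odom_origin_world[1]
    c, s = math.cos(-yaw_offset), math.sin(-yaw_offset)  # inverse rotation
    return (c * dx - s * dy, s * dx + c * dy)
```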

I hope you can help me! This is the last step needed to finish my work: my mathematical model works and I have coded it properly, but detecting whether the drone has reached the destination does not work yet.

I'm using Indigo Igloo on Ubuntu 14.04 in a virtual machine, together with ardrone_autonomy.

Thank you in advance! Yours, Steve

Update:

Thank you for answering this fast, lads! I appreciate it!

So, is there any way to visualize or draw up some kind of coordinate system for my calculations? I would like to do a coordinate transformation on my own in order to get the destination coordinate in the drone's coordinate system.

The aim of this calculation is to supervise the drone's travel to the destination point: if the drone has not turned enough and starts to travel in the wrong direction, I can shut down the process automatically. Or, if the drone has arrived in the destination's vicinity (for example within 30 cm of that point), I can send it a signal telling it to stop, because it has reached its destination.
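Both checks above reduce to simple geometry once the goal and the current pose are expressed in the same frame. A hedged sketch (the 0.30 m tolerance comes from the question; the function names are illustrative):

```python
import math

def has_arrived(current, goal, tol=0.30):
    """True when the drone is within tol metres of the goal (2D check)."""
    return math.hypot(goal[0] - current[0], goal[1] - current[1]) <= tol

def heading_error(current, goal, yaw):
    """Signed angle (radians, wrapped to [-pi, pi]) between the drone's
    heading and the direction to the goal; a large magnitude means the
    drone turned the wrong way and the flight can be aborted."""
    desired = math.atan2(goal[1] - current[1], goal[0] - current[0])
    err = desired - yaw
    return math.atan2(math.sin(err), math.cos(err))  # wrap to [-pi, pi]
```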


Update 2:

I am sorry to say that ardrone_autonomy's documentation is so incomplete. The coordinate-frames chapter is only one page long, and there is no concrete documentation of how the drone's frames are related to each other, or of how and when the odom tf is set up. Why do I have to take the time to figure out such important things myself?

Why can't there be one person, or a team, who writes the documentation responsibly? It is very annoying!