GPS/IMU and sensor data synchronization and transformations?
Hello,
I have GPS/IMU data read by the ROS driver for a DJI drone (A3 API), and I am using this information (sensor_msgs/Imu and sensor_msgs/NavSatFix) to transform other sensor data on the robot into the drone's parent frame as it moves.
- To make sure I am moving in the right direction, my first question is this: should the GPS frame (UTM-converted values) be the parent frame for all sensor information (e.g., the camera), or should it be the drone's body frame?
- If I write two publisher nodes that send out the GPS and IMU transforms separately to the same frame, is the data time-synchronized in ROS time by the DJI SDK ROS driver? (If that makes sense.)
- How do I use a UTM object as an x,y,z point in this particular context?
- Once the transforms are working (say I can visualize the live point clouds moving with the drone in RViz), if I store the XYZI point clouds while the transforms are being broadcast and later view them together in pcl_viewer, will the displacements be written into the point clouds, so that I can visualize a "map" of the area covered by the drone?
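To show how I currently picture the transform chain for questions 1 and 3: take the UTM-converted NavSatFix as the translation and the IMU quaternion as the rotation, then map a sensor point from the body frame into the fixed parent frame. A minimal numpy sketch of my mental model (the quaternion, UTM coordinates, and point values are made up for illustration; no ROS APIs involved):

```python
import numpy as np

def quat_to_rot(x, y, z, w):
    """Rotation matrix from a unit quaternion, using the same component
    ordering as geometry_msgs/Quaternion (x, y, z, w)."""
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

# Hypothetical values: IMU orientation = 90 deg yaw, drone position in UTM.
q = (0.0, 0.0, np.sin(np.pi / 4), np.cos(np.pi / 4))  # 90 deg about z
t = np.array([500000.0, 4100000.0, 120.0])            # easting, northing, altitude

p_body = np.array([1.0, 0.0, 0.0])        # a point 1 m ahead of the drone
p_world = quat_to_rot(*q) @ p_body + t    # rotate into the parent frame, then translate
print(p_world)                            # ~[500000, 4100001, 120]
```

Is this the right picture, i.e., with the UTM/world frame as the parent and the rotation coming from the IMU orientation?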
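For the last question, my understanding is that once each cloud is expressed in the fixed parent frame, the displacement is already baked into the point coordinates, so simply concatenating the saved clouds should give the "map". A toy sketch with N x 4 XYZI arrays (hypothetical poses, rotation omitted for brevity, no PCL):

```python
import numpy as np

def transform_xyzi(cloud, translation):
    """Shift the XYZ columns of an N x 4 (x, y, z, intensity) cloud into the
    parent frame; the intensity column is left untouched."""
    out = cloud.copy()
    out[:, :3] += translation
    return out

# Two hypothetical scans taken 5 m apart along x, each with one point at the sensor origin.
scan_a = np.array([[0.0, 0.0, 0.0, 10.0]])
scan_b = np.array([[0.0, 0.0, 0.0, 20.0]])

world_a = transform_xyzi(scan_a, np.array([0.0, 0.0, 0.0]))
world_b = transform_xyzi(scan_b, np.array([5.0, 0.0, 0.0]))

# Concatenating the transformed scans accumulates them into one "map" cloud:
world_map = np.vstack([world_a, world_b])
print(world_map[:, 0])  # x coordinates now show the 5 m displacement: [0. 5.]
```

Is this essentially what happens when I save the already-transformed clouds and open them together in pcl_viewer?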
I have read a lot of questions here and the ROS wiki documentation extensively, but I wanted to clarify these points directly because I am a little confused, being fairly new to the more advanced parts of ROS. If my thinking is wrong, please correct me; I really appreciate any help or guidance, thanks :)
- Sneha