On board with all of the above.
Regarding your comment: many/most IMUs have a reference orientation shown on the device (e.g. http://www.ecnmag.com/sites/ecnmag.com/files/USR1001507_134852_MicroStrain3DMGX335InertialSensor_0.png). I think it would make sense to output the orientation in terms of the 'orientation of the reference frame in the world frame'. Correspondingly, we should output the angular_velocity and acceleration in this reference frame as well.
Counterintuitively, the um6 driver manipulates the data internally (https://github.com/ros-drivers/um6/blob/indigo-devel/src/main.cpp#L205), with the goal of providing a more intuitive IMU orientation (ENU instead of NED). I think this kind of hidden transformation adds confusion instead - the data frame no longer matches the reference image on the device.
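For concreteness, the hidden remap in question amounts to swapping axes between the NED and ENU conventions. A minimal sketch in plain Python (not the actual driver code, which works on quaternions as well):

```python
def ned_to_enu(v):
    """Remap a vector from NED (north, east, down) axes to
    ENU (east, north, up) axes - the kind of driver-side
    transform applied to angular velocity or acceleration."""
    n, e, d = v
    return (e, n, -d)

# North in NED becomes +y in ENU; "down" becomes negative z.
print(ned_to_enu((1.0, 2.0, 3.0)))  # (2.0, 1.0, -3.0)
```

The point of the argument above is that a user reading the data has no way to know this remap happened unless they read the driver source.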
I think it makes sense to leave setting up a proper base_link->imu_link transform to whoever is integrating the IMU on the robot. The only time it would be appropriate to do a driver-side transform would be to convert from left-handed to right-handed.
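As a sketch of what that one legitimate driver-side fix might look like, assuming the left-handed frame differs from a right-handed one only by a mirrored z-axis (a hypothetical example, not taken from any particular driver):

```python
def flip_z_vector(v):
    """Convert a vector between two frames that differ by a
    mirrored z-axis (left-handed <-> right-handed)."""
    x, y, z = v
    return (x, y, -z)

def flip_z_quaternion(q):
    """Convert an orientation quaternion (w, x, y, z) across the
    same z-mirror: conjugating a rotation by the reflection negates
    the axis components lying in the mirror plane (x and y)."""
    w, x, y, z = q
    return (w, -x, -y, z)
```

A rotation purely about z is unchanged by this mapping, since its axis coincides with the mirror normal.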
--- EDIT
I think a REP (or an amendment to an existing one) would go a long way toward reinforcing standards for IMU output, particularly regarding: orientation data expressed in the world-fixed frame and rotation/acceleration data in the device's reference frame; adherence to right-handed rotations; and the direction of the device's acceleration vector due to gravity (up, not down).
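To illustrate the gravity point: under the convention argued for here, a stationary, level IMU reports the specific force opposing gravity, i.e. roughly +9.81 m/s² on +z, not -9.81. A hypothetical sanity check (the function name and tolerance are mine, not from any REP):

```python
G = 9.80665  # standard gravity, m/s^2

def gravity_convention_ok(accel, tol=0.5):
    """Return True if a stationary, level IMU reports gravity
    'up' on +z, as the proposed convention requires."""
    ax, ay, az = accel
    return abs(ax) < tol and abs(ay) < tol and abs(az - G) < tol
```

A check like this in a driver test suite would catch the common mistake of publishing the gravitational acceleration itself (down) instead of the measured specific force (up).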