
Are odom, pose, twist and imu processed differently in robot_localization?

I wonder why robot_localization distinguishes between the four measurement types odom, pose, twist and imu. Is it only for convenience, because these are four common message types in ROS, or are they treated differently?

I quickly created an overview of the different message types and the data they contain:

                                     Linear acceleration ───────────────┐
                                     Angular velocity ─────────────┐    │
                                     Linear velocity ─────────┐    │    │
                                     Orientation ────────┐    │    │    │
                                     Position ──────┐    │    │    │    │
                                                    │    │    │    │    │
    ────────────────────────────────────────────────┼────┼────┼────┼────┼──
                                                    │    │    │    │    │
    Odometry                                        │    │    │    │    │
                                                    │    │    │    │    │
        std_msgs/Header header                                         
        string child_frame_id                                          
        geometry_msgs/PoseWithCovariance pose       ×    ×             
        geometry_msgs/TwistWithCovariance twist               ×    ×   

    ────────────────────────────────────────────────┼────┼────┼────┼────┼──
                                                    │    │    │    │    │
    PoseWithCovarianceStamped                       │    │    │    │    │
                                                    │    │    │    │    │
        std_msgs/Header header
        geometry_msgs/PoseWithCovariance pose       ×    ×

    ────────────────────────────────────────────────┼────┼────┼────┼────┼──
                                                    │    │    │    │    │
    TwistWithCovarianceStamped                      │    │    │    │    │
                                                    │    │    │    │    │
        std_msgs/Header header
        geometry_msgs/TwistWithCovariance twist               ×    ×

    ────────────────────────────────────────────────┼────┼────┼────┼────┼──
                                                    │    │    │    │    │
    Imu                                             │    │    │    │    │
                                                    │    │    │    │    │
        std_msgs/Header header
        geometry_msgs/Quaternion orientation             ×
        float64[9] orientation_covariance
        geometry_msgs/Vector3 angular_velocity                     ×
        float64[9] angular_velocity_covariance
        geometry_msgs/Vector3 linear_acceleration                       ×
        float64[9] linear_acceleration_covariance

    ────────────────────────────────────────────────┴────┴────┴────┴────┴──

    Symbols:
        × with covariance

If I understand correctly, I could integrate orientation measurements from, e.g., a compass as odometry (orientation part of the pose), as IMU data (orientation), or as a pose (orientation part). In the launch file I would select only yaw, and all other measurements would be ignored anyway.
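As an illustration of the last point, selecting only yaw from an IMU input could look roughly like this in a launch file (a sketch; the node name and the topic `/imu/data` are assumptions, not from any particular setup):

```xml
<launch>
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization">
    <!-- Subscribe to an IMU source -->
    <param name="imu0" value="/imu/data"/>
    <!-- Fuse only yaw; order is:
         x, y, z, roll, pitch, yaw,
         vx, vy, vz, vroll, vpitch, vyaw,
         ax, ay, az -->
    <rosparam param="imu0_config">[false, false, false,
                                   false, false, true,
                                   false, false, false,
                                   false, false, false,
                                   false, false, false]</rosparam>
  </node>
</launch>
```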

Would the result be equal, or is there anything I overlooked?
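(For reference, whichever of the three message types is used, the compass heading has to be packed into a quaternion, since all of them carry orientation that way. A minimal sketch of that conversion, assuming a planar robot so roll and pitch are zero:)

```python
import math

def yaw_to_quaternion(yaw):
    """Convert a planar yaw angle (radians) to a quaternion (x, y, z, w),
    as needed for the orientation field of sensor_msgs/Imu,
    nav_msgs/Odometry, or geometry_msgs/PoseWithCovarianceStamped.
    Assumes roll = pitch = 0 (rotation about the z axis only)."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# A compass heading of 90 degrees:
qx, qy, qz, qw = yaw_to_quaternion(math.pi / 2.0)
```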
