IMU orientation callback of robot_localization and Mahalanobis distance formula

asked 2018-09-05 04:34:47 -0500 by jxl

Hello,

In the r_l package, the IMU topic callback is defined here. For the IMU orientation it calls poseCallback(pptr, poseCallbackData, baseLinkFrameId_, true); which in turn calls the preparePose() func with the params targetFrame="base_link" and imuData=true.

Let's look at the preparePose() func carefully. Because an IMU may or may not have a magnetometer (commonly it has a gyro and an accelerometer), and the imu differential and relative params can each be set differently, we get 8 cases; let us focus on the 4 cases where differential=false, as in this picture. Following ROS Conventions for IMU Sensor Drivers, when there is no magnetometer, the IMU's world frame is the IMU's own frame at startup. Generally, at that initial time the IMU reports roll=0, pitch=0, and yaw unknown.

if (imuData)
  {
    // First, convert the transform and measurement rotation to RPY
    // @todo: There must be a way to handle this with quaternions. Need to look into it.
    double rollOffset = 0;
    double pitchOffset = 0;
    double yawOffset = 0;
    double roll = 0;
    double pitch = 0;
    double yaw = 0;
    RosFilterUtilities::quatToRPY(targetFrameTrans.getRotation(), rollOffset, pitchOffset, yawOffset);
    RosFilterUtilities::quatToRPY(poseTmp.getRotation(), roll, pitch, yaw);

    // 6b. Apply the offset (making sure to bound them), and throw them in a vector
    tf2::Vector3 rpyAngles(FilterUtilities::clampRotation(roll - rollOffset),
                           FilterUtilities::clampRotation(pitch - pitchOffset),
                           FilterUtilities::clampRotation(yaw - yawOffset));

    // 6c. Now we need to rotate the roll and pitch by the yaw offset value.
    // Imagine a case where an IMU is mounted facing sideways. In that case
    // pitch for the IMU's world frame is roll for the robot.
    tf2::Matrix3x3 mat;
    mat.setRPY(0.0, 0.0, yawOffset);
    rpyAngles = mat * rpyAngles;
    poseTmp.getBasis().setRPY(rpyAngles.getX(), rpyAngles.getY(), rpyAngles.getZ());

    // We will use this target transformation later on, but
    // we've already transformed this data as if the IMU
    // were mounted neutrally on the robot, so we can just
    // make the transform the identity.
    targetFrameTrans.setIdentity();
  }

It seems to make sense that poseTmp's orientation finally represents, at time=msg.header.stamp, the orientation of base_link in the odom frame as estimated from the IMU orientation. But if the IMU has a magnetometer, what is the meaning of rpyAngles = mat * rpyAngles? We know that the IMU's orientation is from imu_world_frame to imu_link. Before multiplying rpyAngles by mat, rpyAngles is (at time=msg.header.stamp) the orientation of base_link in imu_world_frame, and mat is (at the same time) the orientation of imu_link in base_link. So what does rpyAngles = mat * rpyAngles mean here? Can they be multiplied like that, or am I misunderstanding something?
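To see concretely what that multiplication does, here is a small standalone sketch (not robot_localization code; the 90-degree mounting yaw and the 0.1 rad pitch are made-up values) that just repeats the same operation with tf2:

#include <cmath>
#include <cstdio>
#include <tf2/LinearMath/Matrix3x3.h>
#include <tf2/LinearMath/Vector3.h>

int main()
{
  // Assumed mounting: the IMU is rotated +90 degrees in yaw on the robot
  // (facing sideways), and it reports a pure pitch of 0.1 rad.
  const double yawOffset = M_PI / 2.0;
  tf2::Vector3 rpyAngles(0.0, 0.1, 0.0);  // (roll, pitch, yaw) after the offsets

  // Same operation as in preparePose(): rotate the RPY triple by the yaw offset.
  tf2::Matrix3x3 mat;
  mat.setRPY(0.0, 0.0, yawOffset);
  rpyAngles = mat * rpyAngles;

  // Prints roughly roll=-0.100 pitch=0.000 yaw=0.000: the IMU's pitch shows up
  // as roll of base_link, like the comment in the source says.
  std::printf("roll=%.3f pitch=%.3f yaw=%.3f\n",
              rpyAngles.getX(), rpyAngles.getY(), rpyAngles.getZ());
  return 0;
}

So numerically I can see the effect, but I still do not understand why treating the RPY triple as a vector and rotating it by the yaw offset is valid when the IMU has a magnetometer.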

Q2: Why is estimateErrorCovariance_ initialized to such a small value here? Shouldn't it be initialized with large values instead, as the comment says?

// Set the estimate error covariance. We want our measurements
// to be accepted rapidly when the filter starts, so we should
// initialize the state's covariance with large values.
estimateErrorCovariance_.setIdentity();
estimateErrorCovariance_ *= 1e-9;
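
To make this concrete, here is a toy 1-D Kalman update (made-up numbers, not the r_l code) showing how the size of the initial P changes how much the first measurement is trusted:

#include <cstdio>
#include <initializer_list>

int main()
{
  const double R = 1.0;   // measurement noise variance (assumed)
  const double z = 5.0;   // first measurement
  const double x0 = 0.0;  // initial state estimate

  for (double P0 : {1e-9, 1e9})
  {
    // Kalman gain K = P H' (H P H' + R)^-1, with H = 1 in this toy example.
    const double K = P0 / (P0 + R);
    const double x1 = x0 + K * (z - x0);
    std::printf("P0=%g  ->  K=%g, corrected state=%g\n", P0, K, x1);
  }
  return 0;
}

With P0 = 1e-9 the gain is almost 0 and the first measurement barely moves the state, while with P0 = 1e9 the gain is almost 1 and the state jumps to the measurement, which is what the comment seems to describe.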

Q3: When checking the Mahalanobis distance between the mapped measurement and the state in the ekf's correct() func, I think a measurement z follows a normal distribution N(H*xk^, H*Pk^*H') without the measurement noise, where xk^ is the predicted state and Pk^ is the corresponding covariance. Why in the code ...
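To be explicit about the formulation I have in mind, here is a minimal Eigen sketch (made-up matrices, not the r_l code) of the textbook gate, where the innovation covariance is S = H*P*H' + R; whether R should appear there is part of what I am asking:

#include <Eigen/Dense>
#include <cstdio>

int main()
{
  Eigen::Vector2d x(1.0, 2.0);                              // predicted state xk^
  Eigen::Matrix2d P = 0.1  * Eigen::Matrix2d::Identity();   // predicted covariance Pk^
  Eigen::Matrix2d H = Eigen::Matrix2d::Identity();          // measurement matrix
  Eigen::Matrix2d R = 0.05 * Eigen::Matrix2d::Identity();   // measurement noise covariance
  Eigen::Vector2d z(1.5, 2.5);                              // measurement

  // Squared Mahalanobis distance of the innovation: d^2 = (z - Hx)' S^-1 (z - Hx).
  Eigen::Vector2d innovation = z - H * x;
  Eigen::Matrix2d S = H * P * H.transpose() + R;
  double d2 = innovation.dot(S.inverse() * innovation);

  // A gate would compare d2 against a chi-squared threshold and reject the
  // measurement if the threshold is exceeded.
  std::printf("squared Mahalanobis distance = %.4f\n", d2);
  return 0;
}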
