This is an issue that I have been thinking about for navsat_transform_node. In your case, one option would be to define a transform from odom to sonar that is actually the offset from base_link, but rotated by the robot's current heading. This means the transform could not come from a static transform, but would have to originate from some other node that manages the rotation of the offset and the transform broadcast. Then you can simply report each sonar position measurement in the sonar frame, and ekf_localization_node will transform it before fusing it.
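
A minimal sketch of what that broadcaster could look like (the odometry/filtered topic, the frame names, and the (1, 1) mounting offset are illustrative assumptions):

    #!/usr/bin/env python
    # Sketch: broadcast odom->sonar, where the translation is the sonar's
    # base_link mounting offset rotated by the robot's current heading.
    import math
    import rospy
    import tf
    from nav_msgs.msg import Odometry

    SONAR_OFFSET_X, SONAR_OFFSET_Y = 1.0, 1.0  # offset in base_link (example)

    def odom_cb(msg):
        q = msg.pose.pose.orientation
        yaw = tf.transformations.euler_from_quaternion([q.x, q.y, q.z, q.w])[2]
        # Rotate the static base_link offset by the current heading
        x = SONAR_OFFSET_X * math.cos(yaw) - SONAR_OFFSET_Y * math.sin(yaw)
        y = SONAR_OFFSET_X * math.sin(yaw) + SONAR_OFFSET_Y * math.cos(yaw)
        br.sendTransform((x, y, 0.0), (0.0, 0.0, 0.0, 1.0),
                         msg.header.stamp, 'sonar', 'odom')

    if __name__ == '__main__':
        rospy.init_node('sonar_frame_broadcaster')
        br = tf.TransformBroadcaster()
        rospy.Subscriber('odometry/filtered', Odometry, odom_cb)
        rospy.spin()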

EDIT in response to comments: ekf_localization_node will try to transform all absolute pose data into the frame specified by the world_frame parameter. If you make the sonar transform relative to base_link, it will transform the position from sonar to base_link, and then apply the transform to the world_frame. For example, if your robot is at (10, 5) and your sonar is mounted at (1, 1) with respect to the base_link frame, and you get a sonar-based position estimate of (12, 7), then your measurement might get transformed to, say, (11, 6) (after transforming to base_link), and then to (21, 11) when the transform to the world_frame is applied.
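
For illustration, here is roughly that operation done by hand with tf; ekf_localization_node does the equivalent internally (frame and node names here are assumptions):

    import rospy
    import tf
    from geometry_msgs.msg import PoseStamped

    rospy.init_node('transform_example')
    listener = tf.TransformListener()

    ps = PoseStamped()
    ps.header.frame_id = 'sonar'     # frame the measurement was reported in
    ps.header.stamp = rospy.Time(0)  # use the latest available transform
    ps.pose.position.x = 12.0        # the (12, 7) sonar-based estimate
    ps.pose.position.y = 7.0
    ps.pose.orientation.w = 1.0

    # tf chains sonar -> base_link -> odom in a single call; this mirrors
    # what ekf_localization_node does before fusing the measurement
    listener.waitForTransform('odom', 'sonar', rospy.Time(0), rospy.Duration(1.0))
    pose_in_odom = listener.transformPose('odom', ps)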

EDIT in response to question edit: Interesting; I've not seen this slowdown occur before. Do you mean the message broadcast rate slows down (rostopic hz /odometry/filtered), or that it only produces an updated state at a slower rate? When I get a chance, I'll check out the bag file. My first guess is that a required transform is failing most of the time, so ekf_localization_node is ignoring the measurement.

EDIT 2: That is strange. I'll check it out and let you know what I find.

EDIT 3: Figured out the issue. Your pose2 topic has an incorrect covariance. You're measuring Y, so the covariance value should be non-zero in the (Y, Y) position, which, in a pose message's row-major 6x6 covariance array, is index 7 (zero-based), not index 1 (index 1 is the covariance between X and Y). As your pose2 data is being reported in a different frame, when the covariance gets rotated into the odom frame, the final measurement covariance matrix that gets fed into the filter is broken. This eventually breaks the covariance for the entire state, and the filter locks up, presumably when trying to invert a singular matrix.
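
For reference, the covariance in a pose message is a row-major 6x6 matrix over (X, Y, Z, roll, pitch, yaw), so the diagonal lands at indices 0, 7, 14, 21, 28, and 35. A quick sketch with made-up variance values:

    from geometry_msgs.msg import PoseWithCovarianceStamped

    msg = PoseWithCovarianceStamped()
    cov = [0.0] * 36       # row-major 6x6 over (X, Y, Z, roll, pitch, yaw)

    cov[7] = 0.25          # (Y, Y): variance of the measured Y value
    # cov[1] is the X/Y cross-covariance -- NOT where the Y variance belongs
    for i in (0, 14, 21, 28, 35):
        cov[i] = 1e6       # made-up large values for the unmeasured variables

    msg.pose.covariance = cov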

EDIT 4: Yeah, it's a covariance issue. Your yaw variance is ~9 degrees. That won't play well with the differential setting, at least as it is now. It's partially a bug, and partially an issue with unnecessary use of the differential setting. If you have multiple measurement sources for one variable, it's usually best to let one of them be integrated normally (i.e., with differential set to false), and let the others be handled differentially. This is more important for orientation data than for pose data. As of the next release, all differential measurements will be converted to velocities, and the covariances will be handled in a more numerically correct manner, but it will also mean that users should really only use the differential setting when necessary.
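
As a hypothetical example of that arrangement with two pose sources, where pose0 is fused absolutely and pose1 is handled differentially:

    # Hypothetical: one absolute source, one differential source. Parameter
    # names follow robot_localization's [sensor]_differential convention;
    # the node name is an assumption.
    import rospy

    rospy.set_param('ekf_localization/pose0_differential', False)
    rospy.set_param('ekf_localization/pose1_differential', True)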

EDIT 5: I'd have to see an updated bag file, but if I had to guess, what you're seeing is the inevitable result of variables being correlated in the state transition function and the Jacobian matrix [UPDATE: I no longer think this is true. See edit 6 below]. Z is correlated with Z velocity, and Z velocity is correlated with X and Y position (due to your orientation). The real issue (again, without seeing a bag file) is that for whatever reason, the covariances between your variables are growing to a point where a measurement in one will affect the other. One recommendation I would make in general would be to only use the differential setting when it's absolutely necessary. For example, once I merge my current working branch back into the -devel branches, using the differential setting on absolute orientation data will not be a good idea (unless you have more than one source of that information, and one of them is fused absolutely). In any case, I would advise you to post an updated bag file with all of the data sources running except ekf_localization_node, and then drive the robot around a bit, and then let me know what exactly you did.

EDIT 6: After looking at your data, there are a couple of things that I noticed:

  • First, your svs/depth data is being reported in the SVS frame. I ran tf_echo from odom->SVS, and here's a snapshot of what I saw:

    At time 1423618656.761
    - Translation: [-0.575, -0.277, 0.033]
    - Rotation: in Quaternion [-0.015, -0.101, 0.946, 0.308]
                in RPY [-0.202, -0.034, 2.515]
    At time 1423618657.766
    - Translation: [-0.575, -0.277, 0.033]
    - Rotation: in Quaternion [-0.015, -0.102, 0.946, 0.308]
                in RPY [-0.203, -0.035, 2.515]
    At time 1423618658.769
    - Translation: [-0.575, -0.277, 0.033]
    - Rotation: in Quaternion [-0.015, -0.103, 0.946, 0.308]
                in RPY [-0.205, -0.035, 2.516]
    At time 1423618659.741
    - Translation: [-0.575, -0.277, 0.033]
    - Rotation: in Quaternion [-0.015, -0.103, 0.946, 0.308]
                in RPY [-0.206, -0.035, 2.516]
    

    The issue here is that you have a rotation in that transform (and it appears to be changing over time). The roll and pitch angles are non-zero, which is why you see X and Y motion when all you do is integrate the depth data; see the quick check after this list. Where is this sensor mounted on your vehicle?

  • As I noted in the comment for your question, your raw sonar data did have a jump from 4 meters to 2 meters when you surfaced. Not sure if you were trying to determine if that was a result of the raw data or ekf_localization_node.
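
A quick check of the first point: rotating a pure-Z depth reading by the quaternion from the tf_echo output above produces non-zero X and Y components.

    import numpy as np
    import tf.transformations as tft

    q = [-0.015, -0.101, 0.946, 0.308]      # odom->SVS rotation from tf_echo
    depth = np.array([0.0, 0.0, 4.0, 1.0])  # a 4 m reading along the SVS Z axis

    R = tft.quaternion_matrix(q)            # 4x4 homogeneous rotation matrix
    print(R.dot(depth))                     # X and Y components are non-zero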

Perhaps I can learn a bit more about your setup. Is the robot always at a fixed orientation with respect to the sides of the pool, or can its heading vary? Where is the (0, 0) point in your pool world frame? Perhaps a diagram would be good to place in your original question.

EDIT 7: Whew, nice diagram. To answer your question, the filter itself doesn't care if you fuse them separately. It's built to let you measure as many variables in each measurement as you want, and they don't need to be synchronized in any way. However, I think what you're getting at is that there are implications to independently rotating each component, because what you really want to do is rotate the coordinate in the pool frame so that it's in the odom frame.

In any case, I want to make sure of a couple things.

  • As your diagram is drawn, I would expect your initial position in the pool frame to be approximately (-4, -2). Is that correct?
  • If you turned off differential integration for your IMU data, then I would expect your odom frame's X-axis to align with whatever your IMU's zero yaw position is (likely magnetic north). I think this is what you've shown. However, I don't see why your odom frame's origin is the same as your pool frame's. I would expect your origin in the odom frame to match your robot's start position. In other words, when you start, your position in the odom frame would be (0, 0), with a heading of whatever your IMU is currently reading.

What you're effectively doing is taking range measurements and trying to use them as absolute position information. If I were you, I would do something like this (a rough sketch follows the list):

  1. Define transforms from base_link->sonar1 and base_link->sonar2. I am considering base_link to be rigidly affixed to your vehicle's origin. One of the transforms will just have a +X translation, and the other (it appears) will have a -X translation and a rotation of +90 degrees yaw.
  2. Now, define a virtual sensor that takes in the raw range measurements and applies those transforms. You should now have the two ranges in your base_link frame: one will be +X (from the long-edge sonar), and the other +Y. You can easily test to make sure that the transformed values make sense.
  3. Now, in the event that your robot is not perfectly aligned with the pool, you need to correct those ranges to get the "true" (straight-line) ranges to the edges.
  4. At this point, you have the two straight-line distances from your vehicle's origin to the pool edges. You need to negate both of them.
  5. Now, apply your static pool-to-north rotation.
  6. Set the measurement's frame to odom.
  7. Feed it to ekf_localization_node, and do it differentially.
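
An untested sketch of that pipeline (topic names, the pool-to-north angle, and the covariance values are placeholders; the two ranges are assumed to already be expressed along the base_link axes per steps 1-2):

    #!/usr/bin/env python
    # Untested sketch of the virtual sensor described above. Assumes the two
    # incoming ranges already lie along base_link +X and +Y (steps 1-2), and
    # that heading comes from your IMU/filter.
    import math
    import rospy
    import tf
    from sensor_msgs.msg import Range
    from nav_msgs.msg import Odometry
    from geometry_msgs.msg import PoseWithCovarianceStamped

    POOL_TO_NORTH_YAW = 0.0  # static pool->north rotation (placeholder)

    class VirtualPositionSensor(object):
        def __init__(self):
            self.yaw = 0.0
            self.range_x = None  # long-edge sonar, along base_link +X
            self.range_y = None  # other sonar, along base_link +Y
            self.pub = rospy.Publisher('pool_position',
                                       PoseWithCovarianceStamped, queue_size=10)
            rospy.Subscriber('sonar1/range', Range, self.sonar_x_cb)
            rospy.Subscriber('sonar2/range', Range, self.sonar_y_cb)
            rospy.Subscriber('odometry/filtered', Odometry, self.odom_cb)

        def odom_cb(self, msg):
            q = msg.pose.pose.orientation
            self.yaw = tf.transformations.euler_from_quaternion(
                [q.x, q.y, q.z, q.w])[2]

        def sonar_x_cb(self, msg):
            self.range_x = msg.range
            self.publish(msg.header.stamp)

        def sonar_y_cb(self, msg):
            self.range_y = msg.range
            self.publish(msg.header.stamp)

        def publish(self, stamp):
            if self.range_x is None or self.range_y is None:
                return
            # Step 3: correct beam ranges to straight-line wall distances
            mis = self.yaw - POOL_TO_NORTH_YAW  # misalignment with the pool
            x = -self.range_x * math.cos(mis)   # step 4: negate
            y = -self.range_y * math.cos(mis)
            # Step 5: apply the static pool-to-north rotation
            c, s = math.cos(POOL_TO_NORTH_YAW), math.sin(POOL_TO_NORTH_YAW)
            out = PoseWithCovarianceStamped()
            out.header.stamp = stamp
            out.header.frame_id = 'odom'        # step 6
            out.pose.pose.position.x = c * x - s * y
            out.pose.pose.position.y = s * x + c * y
            out.pose.pose.orientation.w = 1.0
            out.pose.covariance[0] = 0.5        # (X, X), made-up value
            out.pose.covariance[7] = 0.5        # (Y, Y), made-up value
            self.pub.publish(out)

    if __name__ == '__main__':
        rospy.init_node('virtual_position_sensor')
        VirtualPositionSensor()
        rospy.spin()

Step 7 is then just configuration: fuse the pool_position topic in ekf_localization_node with its differential parameter enabled.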

I haven't thought through this in too much detail, but I think it's close to what you want. What you're attempting may work as well, though. Good luck!