
Fusing 2 sonars / pingers as continuous sensors or 'GPS'

asked 2015-01-16 10:01:07 -0600

updated 2015-02-13 15:40:26 -0600

Picking up where I left off in ekf_localization-node-not-responding, I am trying to fuse position in x and y from 2 sonars (Tritech Micron MK III) with doubly-integrated accelerometer position from an IMU (xsens MTi 30).

This happens inside a rectangular pool, so each sonar

  • will track a side wall and measure the absolute distance to it (one sonar looking at the short wall, the other at the long wall), giving a distance and a heading for each reading,
  • and will report an x and/or y position, to be used either as
    • absolute position, with the x axis parallel to the long side of the pool and y parallel to the short side,
    • or position relative to the starting point (differential_integration == true).

(The point is to get a quicker update rate than doing machine vision on a full scan.)

The problem now is: How can I account for the sonar's offset from base_link since the measurements aren't really in the body frame?

I could either

  1. report the data with frame_id odom or map — but how could I account for the mounting offset, especially when the sub is not aligned with the side walls?
    • Would that be possible with a single instance of robot_localization, or should I then treat them as 'GPS'?
  2. report the data in a Sonar1 and Sonar2 frame and use the current heading of the sonar head (plus the fixed translation) to publish a transform from Sonar1 to base_link
    • This means that I will have to use the attitude solution to track the wall, but that's outside the scope of robot_localization.

I believe the latter is the better approach. Does this make sense, or how could it be improved?



Wrap up on how to fuse the sonars

I am still working through implementing some of these steps, but I will document the process here.

What are the frames and transforms needed for fusing the sonars:

  • Frames:
    • odom
    • pool
    • sonar1 (will be fused for x only)
    • sonar2 (will be fused for y only)
    • base_link
    • imu
  • Transforms:
    • odom->base_link (generated by ekf_localization)
    • odom->pool (static transform, accounting for rotational offset of pool walls/axes with ENU)
    • pool->sonar1 (broadcasts the translation of the sonar mounting position relative to base_link, rotated by the yaw angle of the pool->base_link transform)
    • pool->sonar2
    • base_link->imu
  • Data:
    • /sonar/position/x in frame sonar1
    • /sonar/position/y in frame sonar2
    • /imu/data in frame imu
    • /svs/depth in frame svs
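The pool->sonar transforms above boil down to rotating the fixed mounting offset by the current pool->base_link yaw before broadcasting it. A minimal sketch of that rotation (plain Python, no ROS; the 2D-only treatment and the function name are my assumptions):

```python
import math

def rotated_sonar_offset(mount_offset_xy, yaw):
    """Rotate a sonar mounting offset, given in base_link, by the current
    pool->base_link yaw so it can serve as the translation of the
    pool->sonar transform."""
    x, y = mount_offset_xy
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x - s * y, s * x + c * y)

# Example: sonar mounted 0.5 m ahead of base_link, robot yawed 90 degrees:
tx, ty = rotated_sonar_offset((0.5, 0.0), math.pi / 2)
# the offset now points along the pool's y axis (tx ~ 0.0, ty ~ 0.5)
```

A small node would recompute this whenever the yaw estimate changes and broadcast it as the pool->sonar1 / pool->sonar2 translation.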


EDIT (regarding 1st edit in Answer):

I have set it up the way you recommended: I publish a transform from odom->sonar just before each new sonar reading. The rotational part of that transform uses the current attitude as published by the robot_localization node.

Robot_localization sees the following input at the moment:

  • attitude from the IMU (imu/data)
  • position from the sonars (sonar/position0 and sonar/position1)

However, after a few seconds the robot_localization node slows down to about 1 update every 5 seconds... EDIT: By 'slows down' I mean ...



Any chance you can record a bag file without running robot_localization? I need the tf tree to not have the odom->base_link transform in it.

Tom Moore (2015-01-20 09:35:12 -0600)

No problem: 2015-01-20-19-10-36.bag

Most of the dynamic and static transforms intentionally have a translation of (0,0,0) for the time being.

Raphael Nagel (2015-01-20 13:16:58 -0600)

Re: edit 4, it sounds like it's going to be a covariance issue. I'll check it out. Any reason you're using the differential setting?

Tom Moore (2015-02-02 16:17:07 -0600)

Sorry, when you say that X should not have changed when surfacing, do you mean the raw data should not have changed, or that you think the EKF jumped? In the bag file, the /sonar/positionBMT topic changes from ~4 meters to ~2 meters at around 34 seconds into the bag file.

Tom Moore (2015-02-11 15:36:28 -0600)

Sorry, that was a little unclear. It is a sonar processing artifact. The sonar has a vertical beamwidth of 30° and, unless the robot is perfectly horizontal, it will get a reflection off the water surface if the sub is not fully submerged.

Raphael Nagel (2015-02-12 03:43:14 -0600)

1 Answer


answered 2015-01-16 15:44:34 -0600

Tom Moore

updated 2015-02-13 16:45:18 -0600

This is an issue that I have been thinking about for navsat_transform_node. In your case, one option would be to define a transform from odom to sonar that is actually the offset from base_link, but rotated by the robot's current heading. This means the transform could not come from a static transform, but would have to originate from some other node that manages the rotation of the offset and the transform broadcast. Then you can simply report each sonar position measurement in the sonar frame, and ekf_localization_node will transform it before fusing it.

EDIT in response to comments: ekf_localization_node will try to transform all absolute pose data into the frame specified by the world_frame parameter. If you make the sonar transform relative to base_link, it will transform the position from sonar to base_link, and then apply the transform to the world_frame. For example, if your robot is at (10, 5) and your sonar is mounted at (1, 1) with respect to the base_link frame, and you get a sonar-based position estimate of (12, 7), then your measurement might get transformed to, say, (11, 6) (after transforming to base_link), and then to (21, 11) when the transform to the world_frame is applied.

EDIT in response to question edit: interesting. I've not seen slowdown occur before. Do you mean the message broadcast rate slows down (rostopic hz /odometry/filtered), or that it only produces an updated state at a slower rate? When I get a chance, I'll check out the bag file. My first guess is that a required transform is failing most of the time, so ekf_localization_node is ignoring the measurement.

EDIT 2: That is strange. I'll check it out and let you know what I find.

EDIT 3: Figured out the issue. Your pose2 topic has an incorrect covariance. You're measuring Y, so the covariance value should be non-zero in the (Y, Y) position, which, in a pose message, is position 7 in the row-major array, not position 2 (position 2 is the covariance between X and Z). As your pose2 data is being reported in a different frame, when the covariance gets rotated into the odom frame, the final measurement covariance matrix that gets fed into the filter is broken. This eventually breaks the covariance for the entire state, and the filter locks up, presumably when trying to invert a singular matrix.
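For reference, the pose covariance in a ROS PoseWithCovariance message is a row-major 6x6 array over (X, Y, Z, roll, pitch, yaw). A tiny helper (hypothetical, not part of robot_localization) makes the indexing explicit:

```python
# Order of the pose variables in the 36-element row-major covariance array.
FIELDS = ("x", "y", "z", "roll", "pitch", "yaw")

def cov_index(a, b):
    """Index of cov(a, b) in the flattened 6x6 covariance array."""
    return FIELDS.index(a) * 6 + FIELDS.index(b)

cov = [0.0] * 36
cov[cov_index("y", "y")] = 0.1   # Y variance belongs at index 7, not index 2
```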

EDIT 4: Yeah, it's a covariance issue. Your yaw variance is ~9 degrees. That won't play well with the differential setting, at least as it is now. It's partially a bug, and partially an issue with unnecessary use of the differential setting. If you have multiple measurement sources for one variable, it's usually best to let one of them be integrated normally (i.e., with differential set to false), and let the others be handled differentially. This is more important for orientation data than for pose data. As of the next release, all differential measurements will be converted to velocities ... (more)
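The differential idea mentioned at the end can be sketched as differencing successive absolute poses into a velocity over the sample interval (a simplification of what the filter actually does; the function name is mine):

```python
def pose_delta_to_velocity(prev_pose, curr_pose, dt):
    """Difference two successive absolute poses into a velocity estimate,
    which the filter can then integrate like any other velocity input."""
    return [(c - p) / dt for p, c in zip(prev_pose, curr_pose)]

# Robot moved from (10.0, 5.0) to (10.2, 5.1) in 0.1 s:
vel = pose_delta_to_velocity([10.0, 5.0], [10.2, 5.1], 0.1)  # ~[2.0, 1.0] m/s
```

Because only deltas are fused, any constant offset between measurement sources drops out, which is why differential handling suits redundant sensors.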



So we need the rotation by the heading to align the sonar axes with the base_link axes, because odom would have its axes aligned with ENU?

Raphael Nagel (2015-01-16 17:12:20 -0600)

I was thinking of attaching a frame to the sonar beam with a dynamic transform from base_link to sonar, based on the yaw angle between the heading of the sonar beam and the heading of the robot. Would there be a difference in reporting it through base_link->sonar rather than via odom->sonar?

Raphael Nagel (2015-01-16 17:18:26 -0600)

@EDIT 3: Ahh, that makes sense. I confused the position in the covariance matrix with the position in the measurements-to-take-into-account matrix in the ekf_**.launch file...

I will double check this in the next few days. Thank you so much for your help.

Raphael Nagel (2015-01-27 07:36:14 -0600)

No problem! Let me know if you need anything else.

Tom Moore (2015-01-27 09:29:37 -0600)

@EDIT 4: The differential setting was a relic of our previous thread. My IMU does not output covariances, and the driver was assuming some datasheet values. I am now calculating them myself at startup, and both differential on/off work fine. Thank you very much.

Raphael Nagel (2015-02-05 09:04:40 -0600)

No worries! Let me know if you have more questions.

Tom Moore (2015-02-05 19:04:03 -0600)

@EDIT 7: Most of your assumptions are correct:

  • Yes, the independent rotations were worrying me.
  • approx. (-4,-2) is correct
  • Odom X axis == IMU 0 yaw == magnetic north
  • With differential on that makes sense.

Raphael Nagel (2015-02-13 18:36:39 -0600)

  • I am basically doing 1-3 (separate x and y as of this post).
    • The transform in 1 is relative to the pool->base_link angle to account for the robot turning
    • I am trying my luck with a message_filter as my virtual sensor, transforming and repackaging both sonars into one message

Raphael Nagel (2015-02-13 18:42:59 -0600)



Seen: 540 times