There's quite a bit going on here, and I'm afraid that without sample input messages from every sensor input, it may be hard to help.

However, I can comment on a few things.

  • You are fusing yaw from two sources, and I'd be willing to bet they don't agree, at least not in the long term. Your odometry yaw is going to drift, and your IMU likely won't, so relative mode won't help. I'd pick one source for each absolute pose variable (unless you know they will always be reported in the same frame), and fuse the other sources as velocities, or turn on differential mode for them (see the config sketch after this list).
  • Get rid of any advanced parameters, namely the rejection thresholds. My advice is to always start with one sensor, make sure it's behaving as expected, then add the other sensors one at a time until everything is working as expected. Once all of that works, then you can introduce the advanced params, if needed.
  • If your robot can't move laterally, fuse the 0 value you're getting from your robot's wheel encoders for Y velocity.
  • Make sure the frames of your sensor data match, or that a transform is provided from each sensor to get the data into your world_frame. For example, let's say you drive straight forward for 10 meters, and your IMU says you are going at a heading of pi/3. Your UWB position, though, says you went from (0, 0) to (10, 0), which would imply a heading of 0. That means your sensors' coordinate frames don't agree, and you need to provide a transform between them (though that will assume that your UWB setup is fixed). See the transform sketch after this list.
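
To make the first three points concrete, here is a minimal sketch of a starting configuration with just wheel odometry and a single IMU, assuming a ROS 1 / roslaunch setup. The topic names (/wheel_odometry, /imu/data), frame names, and update rate are placeholders for illustration, not values from your setup:

    <launch>
      <node pkg="robot_localization" type="ekf_localization_node" name="ekf_se" clear_params="true">
        <param name="frequency" value="30"/>
        <param name="odom_frame" value="odom"/>
        <param name="base_link_frame" value="base_link"/>
        <param name="world_frame" value="odom"/>

        <!-- *_config order: [x, y, z, roll, pitch, yaw,
                              vx, vy, vz, vroll, vpitch, vyaw,
                              ax, ay, az] -->

        <!-- Wheel odometry: fused only as velocities. X and Y velocity (Y being the
             0 value from your encoders) and yaw velocity; no absolute yaw. -->
        <param name="odom0" value="/wheel_odometry"/>
        <rosparam param="odom0_config">[false, false, false,
                                        false, false, false,
                                        true,  true,  false,
                                        false, false, true,
                                        false, false, false]</rosparam>
        <param name="odom0_differential" value="false"/>

        <!-- IMU: the single source of absolute yaw (plus yaw velocity). -->
        <param name="imu0" value="/imu/data"/>
        <rosparam param="imu0_config">[false, false, false,
                                       false, false, true,
                                       false, false, false,
                                       false, false, true,
                                       false, false, false]</rosparam>
        <param name="imu0_differential" value="false"/>

        <!-- No *_rejection_threshold or other advanced parameters here; add those
             back only after this basic setup behaves the way you expect. -->
      </node>
    </launch>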
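
For the heading example just given, the fix would be a static transform that rotates the UWB frame into your world_frame. A sketch, again assuming roslaunch and assuming the UWB poses carry a hypothetical frame_id of uwb_frame (the arguments are x y z yaw pitch roll parent child; 1.0472 rad is roughly the pi/3 offset in the example, and you'd measure the real offset for your anchors):

    <launch>
      <!-- Rotate uwb_frame by ~pi/3 about Z relative to odom so that UWB positions,
           once transformed into odom, agree with the IMU heading. -->
      <node pkg="tf2_ros" type="static_transform_publisher" name="odom_to_uwb"
            args="0 0 0 1.0472 0 0 odom uwb_frame"/>
    </launch>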

EDIT 1 in response to comments and updates:

  • If you've updated your parameters, please update the parameters in the question as well. I still see the advanced parameters in there.
  • Re: the comment below, if the UWB is producing a pose estimate in some world frame, then you don't want a base_link->uwb_frame transform. You want a transform from odom->uwb_frame. With the notable exception of IMUs, all pose data should typically be in the world_frame or have a static transform from the sensor frame directly to the world_frame. Likewise, velocities should always be reported in the base_link_frame, or there should be a static transform directly from the sensor frame to the base_link_frame.
  • If your IMUs are mounted at your vehicle's center and are mounted in their neutral orientation, then their frame_id being base_link makes sense. If they are offset from the vehicle's center, then you need static transforms to account for that (see the sketch after this list).
  • Re: the IMUs, it's getting a bit out of scope for this question (you should probably post another one, or raise the issue with the package authors). However, if you have two IMUs and they both report absolute orientation, then yes, they ought to be within some reasonable distance of one another, assuming they are both mounted in the same orientation.
  • You are fusing absolute yaw (with differential mode turned on) and yaw velocity from your wheel odometry. That's effectively counting the same data source twice. Get rid of absolute yaw from the wheel odometry, and then you can disable differential mode for that input.
  • I still recommend starting with just wheel encoders. Get that looking like you want it, then add a single IMU. Once that's behaving, you can either add the other IMU, or switch to the other IMU. In any case, get that working correctly, then add the UWB. There are too many variables at play to solve everything at once.
  • Finally, please post sample messages for all sensor inputs: wheel encoders, IMUs, and UWB data.
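
As an illustration of the static transforms mentioned above, here is a sketch for an IMU that is offset from the vehicle's center, again assuming a ROS 1 / roslaunch setup. The frame name imu_link and the 0.2 m offset are hypothetical, and the same pattern applies to the odom->uwb_frame transform (arguments are x y z yaw pitch roll parent child):

    <launch>
      <!-- Hypothetical IMU mounted 0.2 m forward of base_link, in its neutral
           orientation. The IMU messages would then use frame_id imu_link. -->
      <node pkg="tf2_ros" type="static_transform_publisher" name="base_link_to_imu"
            args="0.2 0 0 0 0 0 base_link imu_link"/>
    </launch>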