
The only issue is that the direction of the GPS odometry seems to be rotated with respect to the other odometry

This suggests that you have your yaw_offset or magnetic_declination parameters set incorrectly. It could also be a timing issue, especially with the way you have your filter set up. More on that later.
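For reference, a minimal sketch of the two relevant navsat_transform_node parameters, assuming a ROS 2-style YAML parameter file. Both values here are placeholders; look up the declination for your location:

```yaml
navsat_transform_node:
  ros__parameters:
    # Magnetic declination at your location, in radians
    # (positive east of true north). Placeholder value.
    magnetic_declination_radians: 0.0
    # Offset between your IMU's 0 heading and east, in radians.
    # E.g., if your IMU reads 0 facing north, this would be pi/2.
    yaw_offset: 0.0
```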

First, you are fusing absolute pose data from four different sources:

  • /visual_slam//odometry: position and orientation
  • /lidar/odom: position
  • /imu/with/absolute/orientation: orientation
  • /odometry/gps: position

(Your visual slam topic has two slashes in it, by the way).

Anyway, I sincerely doubt these sources are going to agree with one another over time, so the pose is going to start jumping between them, even without the GPS data. You should choose one as your most trustworthy pose source, and fuse velocity data from the others, or put them in differential mode if they don't provide velocity.
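As a sketch of what that looks like in a robot_localization YAML parameter file (topic names taken from your setup, with the double slash removed; which source you trust most is up to you):

```yaml
ekf_filter_node:
  ros__parameters:
    # One trusted source for absolute pose (here, the SLAM odometry)
    odom0: /visual_slam/odometry
    odom0_config: [true,  true,  false,   # x, y, z
                   false, false, true,    # roll, pitch, yaw
                   false, false, false,   # vx, vy, vz
                   false, false, false,   # vroll, vpitch, vyaw
                   false, false, false]   # ax, ay, az
    # Only velocity from the others, e.g. the lidar odometry
    # (assuming it publishes twist; otherwise set odom1_differential: true)
    odom1: /lidar/odom
    odom1_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, false,
                   false, false, false]
```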

Just so we're clear, navsat_transform_node requires your IMU to read 0 facing east. The EKF doesn't care. navsat_transform_node needs three things:

  1. An IMU with an earth-referenced heading
  2. A GPS fix
  3. Your robot's current pose in its world frame

It doesn't care what (3) is, so long as it's right-handed. It doesn't have to agree at all with the GPS or IMU that you use in n_t_n. You could just use raw wheel odometry alone for your EKF state estimate, and it would work.
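Concretely, those three inputs arrive on navsat_transform_node's default subscribed topics, so wiring it up is a matter of remapping. A sketch (ROS 1 launch XML; the left-hand names are n_t_n's defaults, the right-hand names are placeholders based on this thread):

```xml
<node pkg="robot_localization" type="navsat_transform_node"
      name="navsat_transform_node">
  <!-- (1) IMU with an earth-referenced heading -->
  <remap from="imu/data" to="/imu/with/absolute/orientation"/>
  <!-- (2) GPS fix -->
  <remap from="gps/fix" to="/gps/fix"/>
  <!-- (3) the robot's current pose in its world frame, from the EKF -->
  <remap from="odometry/filtered" to="/odometry/filtered"/>
</node>
```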

But you need to be careful here. If this happens:

  1. EKF kicks off, gets a single IMU measurement from, e.g., imu3. So its first output has a position and orientation of 0, with whatever angular velocity was in that message.
  2. navsat_transform_node is ready, and gets all its input data, including that first pose from the EKF (i.e., input 3 above). That pose is 0.
  3. EKF fuses pose from some other source. Let's say it's from your SLAM system, that says you are at some non-zero position and with non-zero orientation. Now you've effectively changed your EKF's coordinate frame (i.e., where its origin is). But navsat_transform_node has the wrong EKF pose, so when it transforms the GPS data, it will be wrong.

What you want is:

  1. EKF fuses data from all pose sources (again, preferably with only one source for absolute pose)
  2. navsat_transform_node is ready, and gets all its input data, including that first pose from the EKF
  3. Now all poses from n_t_n should be consistent with your robot's world frame.

This is why n_t_n has the delay parameter. It gives you time to get a sensor measurement from all your sensors into the EKF before it starts.
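A sketch of that parameter (the value is a placeholder; pick something long enough for every sensor to have published at least once):

```yaml
navsat_transform_node:
  ros__parameters:
    # Seconds to wait before computing the UTM-to-world transform,
    # so the EKF output already reflects all of its pose sources.
    delay: 3.0
```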

Update in response to comments:

should I then set lidar odom to differential=true and gps as absolute x y source?

Up to you, really, but as you have it now, your pose will eventually diverge from the GPS path (lidar odometry will be integrating laser scan matches, I presume, so it will gain error over time).
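If you do go that route, a sketch of the relevant EKF parameters (robot_localization YAML, topic names from your setup):

```yaml
ekf_filter_node:
  ros__parameters:
    # Lidar odometry: fuse x/y differentially, so only its pose deltas
    # are used and its accumulated drift doesn't fight the GPS.
    odom0: /lidar/odom
    odom0_config: [true,  true,  false,   # x, y, z
                   false, false, false,   # roll, pitch, yaw
                   false, false, false,   # vx, vy, vz
                   false, false, false,   # vroll, vpitch, vyaw
                   false, false, false]   # ax, ay, az
    odom0_differential: true
    # GPS odometry from navsat_transform_node: absolute x/y.
    odom1: /odometry/gps
    odom1_config: [true,  true,  false,
                   false, false, false,
                   false, false, false,
                   false, false, false,
                   false, false, false]
```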

except imu as i thought n_t_n requires absolute IMU orientation

It does. See the requirements for what n_t_n expects above. I think you may be conflating IMU inputs to your EKF instance and IMU inputs to n_t_n. They are totally unrelated, unless you tell n_t_n to use the state estimate's yaw (use_odometry_yaw), and you don't have that configured (and shouldn't).

is this under the assumption that I have a fix the moment the global map->base_link ekf begins?

Not really, no. When n_t_n starts, it receives input (3) above from the EKF. The first time the EKF publishes, it may not have received an IMU measurement, so the EKF's first output has a yaw of 0. So n_t_n computes the transform from the GPS (really, UTM) frame to the robot's world frame, but the transform is based on that first pose from the EKF, which has a 0 yaw. Then the EKF gets its first IMU measurement, and it has a yaw of, e.g., 2.7 radians. So from then on, the transformed pose coming from n_t_n will be inconsistent with the robot's direction of travel. Instead, if n_t_n had waited until the EKF had received its initial orientation, then it would have based that UTM-to-robot transform on the heading of 2.7 radians, and the transform would be different.

The easiest way around this is for your EKF to _only_ fuse orientation velocity (or at least yaw velocity). Then the robot's heading will start at 0, and will not suddenly jump in the next time step.
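For example, a sketch of an IMU input that fuses only angular velocity (robot_localization YAML; the boolean vector order is [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az]):

```yaml
ekf_filter_node:
  ros__parameters:
    imu0: /imu/with/absolute/orientation
    # Fuse only the angular velocities, not the absolute orientation,
    # so the filter's heading starts at 0 and never jumps.
    imu0_config: [false, false, false,
                  false, false, false,
                  false, false, false,
                  true,  true,  true,
                  false, false, false]
```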