
LIDAR data rotates when using EKF from Robot Localization

asked 2020-06-22 16:19:47 -0500

updated 2020-06-22 18:32:55 -0500


We've been trying to use the Robot Localization package, and we've been having trouble with the LIDAR data rotating in our map. Here is a video of the observed behavior:

As for our system configuration, we have wheel encoders from which we compute odometry, and a Swift Nav Piksi from which we get an RTK GPS fix-based position (x, y, z, no orientation). After some initial tests we realized we were missing a heading input, so we took the raw IMU and magnetometer data from the Swift Nav Piksi and fed it into the imu_filter_madgwick node to get an absolute heading value.
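For reference, the absolute heading produced this way arrives as a quaternion in the IMU message's orientation field. A minimal sketch of recovering yaw from that quaternion (plain Python, no ROS dependencies; this is the standard quaternion-to-Euler yaw formula, not code from our system):

```python
import math

def quaternion_to_yaw(x, y, z, w):
    """Extract the yaw (heading) angle, in radians, from a unit
    quaternion such as sensor_msgs/Imu.orientation."""
    siny_cosp = 2.0 * (w * z + x * y)
    cosy_cosp = 1.0 - 2.0 * (y * y + z * z)
    return math.atan2(siny_cosp, cosy_cosp)

# A quaternion representing a 90-degree rotation about the Z axis:
yaw = quaternion_to_yaw(0.0, 0.0, math.sqrt(0.5), math.sqrt(0.5))
```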

The jitter in the video comes from the IMU; as the robot moves, however, the LIDAR data has a tendency to yaw clockwise. Removing the IMU as an input source removes the jitter, but the LIDAR still yaws just as much.

We have two EKFs:

  1. A map EKF that fuses wheel odometry + IMU + GPS fix position and provides the map -> odom transform, and
  2. An odom EKF that fuses wheel odometry + IMU and provides the odom -> base_link transform.
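This is the usual REP-105 layering: the two transforms chain, so map -> base_link is the composition of what the two EKFs publish. A small sketch of that composition in 2D (plain Python; the pose values are made up for illustration):

```python
import math

def compose(a, b):
    """Compose two 2D poses (x, y, yaw): the pose of b's frame
    expressed in a's frame, i.e. result = a ∘ b."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + bx * math.cos(ath) - by * math.sin(ath),
            ay + bx * math.sin(ath) + by * math.cos(ath),
            ath + bth)

map_to_odom  = (10.0, 5.0, math.pi / 2)   # published by the map EKF
odom_to_base = (1.0, 0.0, 0.0)            # published by the odom EKF
map_to_base = compose(map_to_odom, odom_to_base)
```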

These are our config files -

ekf_map.yaml -

  frequency: 50               # frequency, in Hz, at which the filter will output a position estimate
  sensor_timeout: 0.1         # period, in seconds, after which we consider a sensor to have timed out
  two_d_mode: true            # no 3D information will be used in your state estimate
  transform_time_offset: 0.1  # provide an offset to the transform generated by ekf_localization_node.

  print_diagnostics: true     # If you're having trouble, try setting this to true, and then echo
                              # the /diagnostics_agg topic
  publish_tf: true            # Whether to broadcast the transformation over the /tf topic
  publish_acceleration: false # Whether to publish the acceleration state

  map_frame: map              # Defaults to "map" if unspecified
  odom_frame: odom            # Defaults to "odom" if unspecified
  base_link_frame: base_link  # Defaults to "base_link" if unspecified
  world_frame: map            # Defaults to the value of odom_frame if unspecified

  # Fuse yaw and x/y velocities from the wheel odometry
  odom0: /wheel_encoder/odom
  odom0_differential: false
  odom0_relative: false
  odom0_queue_size: 10
  odom0_config: [false, false, false,   # x, y, z
                 false, false, true,    # roll, pitch, yaw
                 true,  true,  false,   # x vel, y vel, z vel
                 false, false, false,   # roll vel, pitch vel, yaw vel
                 false, false, false]   # x acc, y acc, z acc

  # Fuse Piksi INS orientation data
  imu0: /filtered_imu
  imu0_differential: false
  imu0_relative: false
  imu0_remove_gravitational_acceleration: false
  imu0_queue_size: 10
  imu0_config: [false, false, false,   # x, y, z
                false, false, true,    # roll, pitch, yaw
                false, false, false,   # x vel, y vel, z vel
                false, false, false,   # roll vel, pitch vel, yaw vel
                false, false, false]   # x acc, y acc, z acc

  pose0: /piksi_rover/enu_pose_fix
  pose0_config: [true,  true,  false,
                 false, false, false,
                 false, false, false,
                 false, false, false,
                 false, false, false]
  pose0_differential: false
  pose0_relative: false
  pose0_queue_size: 10
  pose0_rejection_threshold: 5  # Note the difference in parameter name

  process_noise_covariance: [1e-2, 0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,       # x
                             0,    1e-2, 0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0 ...

2 Answers


answered 2020-08-17 19:59:35 -0500

updated 2020-08-17 20:04:00 -0500

Thanks for the pointers @Tom Moore, your suggestions helped narrow down the problem.

I got rid of the map frame EKF altogether to simplify things and used a single EKF, fusing the absolute yaw from the IMU and the yaw velocity from the wheel encoders (we need an earth-referenced heading in our system). This improved performance, but the drift was not completely eliminated.

The culprit was the disagreement between the RTK poses and the IMU orientations (point 3 in @Tom Moore's recommendations). With some tedious testing, and by adding a manual fudge factor to the IMU-reported yaw, we were able to get the LIDAR data and the robot odometry aligned.
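The "fudge factor" can be as simple as a constant offset applied to the yaw before it reaches the filter, with the result wrapped back into (-pi, pi]. A sketch (the 12-degree offset value here is invented for illustration, not our actual calibration):

```python
import math

YAW_OFFSET = math.radians(12.0)  # hypothetical hand-tuned correction

def corrected_yaw(measured_yaw):
    """Apply the fixed offset and wrap the result to (-pi, pi]."""
    yaw = measured_yaw + YAW_OFFSET
    # atan2(sin, cos) is a branch-free way to normalize an angle.
    return math.atan2(math.sin(yaw), math.cos(yaw))

# 175 deg + 12 deg wraps around to -173 deg:
wrapped = corrected_yaw(math.radians(175.0))
```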

For our long term solution, we procured a better IMU and have had superior performance since.


answered 2020-07-10 04:17:08 -0500

Tom Moore

I see a couple of things:

  1. You are fusing absolute yaw data from two sources: wheel odometry and the IMU. Those aren't going to agree over time. So the filter will fuse a measurement from the wheel encoder odometry and jump toward that yaw value, then fuse a measurement from the IMU and jump toward that yaw value. The state estimate will jump back and forth between them.
  2. If, after disabling the IMU, you are seeing a slow rotation in one direction, that says your wheel encoder odometry must be doing the same thing. Try disabling the IMU yaw value, and then plotting the raw wheel encoder yaw value vs. the filtered yaw value, and see what you get. I'm guessing the wheel encoder data is going to more or less match the filter.
  3. Just checking: when you "turn off" the IMU, are you just commenting it out? The parameter will remain on the parameter server unless you restart the ROS core; you'd need to give it a fake topic name that doesn't exist instead.
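Point 1 can be seen with a toy scalar correction step (this is not robot_localization's actual EKF, just an illustration of fusing two absolute sources that disagree by a constant bias):

```python
def fuse(estimate, measurement, gain=0.5):
    """One scalar correction step: move the estimate toward the measurement."""
    return estimate + gain * (measurement - estimate)

wheel_yaw = 0.0   # yaw reported by wheel odometry (radians)
imu_yaw   = 0.2   # IMU yaw, disagreeing by a constant 0.2 rad

estimate = 0.0
history = []
for _ in range(5):
    estimate = fuse(estimate, wheel_yaw)  # correct toward wheel yaw
    history.append(estimate)
    estimate = fuse(estimate, imu_yaw)    # correct toward IMU yaw
    history.append(estimate)
```

The estimate never settles: it keeps bouncing between roughly 0.066 and 0.133 rad, mirroring the back-and-forth jumping described above.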

My recommendations here would be to:

  1. Ignore the map frame EKF for a moment. Get the odom frame EKF behaving the way you want (visualize it in the odom frame), and then move on.
  2. Unless your solution requires an earth-referenced heading for state estimation, I'd just fuse yaw velocity from the IMU and the wheel encoders.
  3. Part of the reason for (2) is that your RTK poses may not agree with your IMU heading (and they definitely won't agree with your wheel encoder heading over time). If your RTK says you moved from (0, 0) to (1, 1), then your yaw should be 45 degrees, but your IMU might read 20 degrees, which would give the robot the appearance of moving laterally.
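The disagreement in point 3 is easy to check numerically: compare the course implied by two successive RTK fixes against the IMU heading (values taken from the example above; the 20-degree IMU reading is the hypothetical one mentioned there):

```python
import math

def course_over_ground(p0, p1):
    """Heading implied by two successive position fixes in an ENU frame
    (x east, y north), measured counter-clockwise from +x, in radians."""
    return math.atan2(p1[1] - p0[1], p1[0] - p0[0])

rtk_heading = course_over_ground((0.0, 0.0), (1.0, 1.0))  # 45 degrees
imu_heading = math.radians(20.0)                          # hypothetical IMU reading
disagreement = rtk_heading - imu_heading                  # 25 degrees of mismatch
```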
