Fusing odometry and IMU data causes turns to be compromised

asked 2021-09-20 21:15:26 -0500

QuaternionWXYZ

updated 2021-09-24 08:17:30 -0500

Mike Scheutzow

As the title suggests, the odometry and IMU data each perform as intended on their own, with no errors in any translation or rotation, but when fused using robot_localization, one direction of rotation significantly lags. There are no other issues with the movement: forward and backward translations are fine, and the other rotational direction behaves as intended. I can confirm this because when I visualize the IMU, odometry, and fused odometry data in rviz, all three data sets behave as intended in their translations and rotations, with the exception of the fused odometry rotation.

I will attach the *.yaml file that sets up the parameters and matrices used for the fusion once I have more than 5 karma. If anyone has experience with this specific issue and/or has suggestions for which course of action to take, please let me know. Any help or advice is greatly appreciated.

Thank you.

---------------.yaml below---------------
# The frequency, in Hz, at which the filter will output a position estimate. Note that the filter will not begin
# computation until it receives at least one message from one of the inputs. It will then run continuously at the
# frequency specified here, regardless of whether it receives more measurements. Defaults to 30 if unspecified.
frequency: 30

silent_tf_failure: false
# The period, in seconds, after which we consider a sensor to have timed out. In this event, we carry out a predict
# cycle on the EKF without correcting it. This parameter can be thought of as the minimum frequency with which the
# filter will generate new output. Defaults to 1 / frequency if not specified.
sensor_timeout: 0.1

# ekf_localization_node and ukf_localization_node both use a 3D omnidirectional motion model. If this parameter is
# set to true, no 3D information will be used in your state estimate. Use this if you are operating in a planar
# environment and want to ignore the effect of small variations in the ground plane that might otherwise be detected
# by, for example, an IMU. Defaults to false if unspecified.
two_d_mode: true

# Use this parameter to provide an offset to the transform generated by ekf_localization_node. This can be used for
# future dating the transform, which is required for interaction with some other packages. Defaults to 0.0 if
# unspecified.
transform_time_offset: 0.0

# Use this parameter to specify how long the tf listener should wait for a transform to become available.
# Defaults to 0.0 if unspecified.
transform_timeout: 0.0

# If you're having trouble, try setting this to true, and then echo the /diagnostics_agg topic to see if the node is
# unhappy with any settings or data.
print_diagnostics: true

# Debug settings. Not for the faint of heart. Outputs a ludicrous amount of information to the file specified by
# debug_out_file. I hope you like matrices! Please note that setting this to true will have strongly deleterious
# effects on the performance of the node. Defaults to false if unspecified.
debug: false

# Defaults to "robot_localization_debug.txt" if unspecified. Please specify the full path.
#debug_out_file ...
(more)

Comments

It's ok to edit your description and copy/paste the lines of text in there, even if it is long. Then format the lines using the 101010 button. For now, all we need to see is how you configure ekf_localization. You edit using the "edit" button at the end of your description.

Mike Scheutzow ( 2021-09-21 13:52:04 -0500 )

Thank you.

QuaternionWXYZ ( 2021-09-23 20:17:58 -0500 )

Please explain in detail what information you believe is coming into this ekf filter (e.g. in terms of position, velocity, or acceleration), and what output you want the filter to generate from that data.

Mike Scheutzow ( 2021-09-24 08:13:47 -0500 )

From the odometry, we send the x/y position for the pose, along with x/y velocity and yaw velocity (rotation about the z-axis) for the twist. From the IMU, we send the yaw (rotational position about the z-axis) and the yaw velocity, which is also being sent from the odometry.

QuaternionWXYZ ( 2021-09-30 15:29:09 -0500 )
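For reference, the field selection described in the comment above would be expressed in robot_localization with the sensor `_config` matrices. The sketch below is a hedged reconstruction, not the asker's actual file; the topic names `/odom` and `/imu/data` are assumptions. Each matrix is 15 booleans selecting, in order: x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az.

```yaml
# Hypothetical sketch matching the description above; topic names assumed.
# Boolean order per row:
#   [x, y, z, roll, pitch, yaw]
#   [vx, vy, vz, vroll, vpitch, vyaw]
#   [ax, ay, az]
odom0: /odom
odom0_config: [true,  true,  false,
               false, false, false,
               true,  true,  false,
               false, false, true,
               false, false, false]

imu0: /imu/data
imu0_config: [false, false, false,
              false, false, true,
              false, false, false,
              false, false, true,
              false, false, false]
```

Note that with this selection, vyaw is fused from both odom0 and imu0, which is the overlap discussed below.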

You have configured odom0, pose0, twist0, and imu0 inputs to ekf_filter. There is a balance you need to find, and using too many conflicting inputs will make the ekf_filter output worse. What are the underlying data sources for each of those first 3 inputs?

Mike Scheutzow ( 2021-10-02 06:54:11 -0500 )

The pose0 and twist0 inputs come from the odom topic. The imu0 input comes from the IMU.

QuaternionWXYZ ( 2021-10-14 20:08:40 -0500 )
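Since pose0 and twist0 draw on the same topic as odom0, and a nav_msgs/Odometry message already contains both a pose and a twist, one possible simplification (a hedged sketch under assumed topic names, not a confirmed fix) is a single odometry input, with yaw rate fused from only one source to avoid the conflicting-inputs problem mentioned above:

```yaml
# Hypothetical simplification: one odometry input instead of separate
# pose0/twist0 subscriptions to the same topic, and yaw rate fused
# from the IMU only, so vyaw is not measured twice.
odom0: /odom
odom0_config: [true,  true,  false,
               false, false, false,
               true,  true,  false,
               false, false, false,   # vyaw left to the IMU
               false, false, false]

imu0: /imu/data
imu0_config: [false, false, false,
              false, false, true,    # absolute yaw
              false, false, false,
              false, false, true,    # yaw rate
              false, false, false]
```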

The goal of the EKF fusion is to improve the estimate of the robot's rotational position.

QuaternionWXYZ ( 2021-10-14 20:10:34 -0500 )