Advice on improving pose estimation with robot_localization

Dear Tom Moore,

Let me start by thanking you for your excellent work on the robot_localization ROS package. I have been playing with it recently to estimate the pose of a differential-drive outdoor robot equipped with several sensors, and I would like to ask your opinion on it.

Perhaps you could give me some tips on how to improve the pose estimation of the robot, especially its orientation. Here is a video of the pose estimation that I get.

The odometry estimate given by the EKF node is the dark orange/red one. Below is a list of the main topics available in my dataset:

 - /laser/scan_filtered            --> Laserscan data 
 - /mavros/global_position/local   --> Odometry msg fusing GPS+IMU (from mavros: Brown in the video)
 - /mavros/global_position/raw/fix --> GPS data 
 - /mavros/imu/data                --> Mavros IMU data 
 - /robot/odom                     --> Encoder data (odometry: Green in the video) 
 - /robot/new_odom                 --> Encoder data (odometry with covariance -- added offline)
 - /tf                             --> Transforms 
 - /uwb/multilateration_odom       --> Multilateration (triangulation method providing global x,y,z) 
 - /yei_imu/data                   --> Our own IMU data 
 - /zed/odom_with_twist            --> Visual odometry from the ZED Stereolabs outdoor camera (Blue in the video)

Although I have plenty of data, as a first stage I am trying to fuse the estimate given by the onboard Ultra-Wideband (UWB) multilateration software (positional data only, no orientation), the robot encoders (which are decent), and our own IMU (on yei_imu/data).

However, as you can see, the estimated orientation of the robot is sometimes odd. I would expect the blue axis of the base_link frame (in the video) to always point up and the red axis to always point forward. However, the red axis in particular sometimes points outwards instead of in the direction of movement. This is clear here:

[image: EKF pose estimate with the red axis of base_link pointing away from the direction of travel]

Do you have any suggestions for improving the orientation estimate of my robot?

Also, I notice that for positional tracking it doesn't seem to make much of a difference to use just the UWB estimate compared to fusing UWB + robot encoders. I was expecting it to smooth out the trajectory a bit, since the UWB data is subject to occasional jumps in position.

These are the params I am currently using in robot_localization, in case you would like to advise any changes.

Btw, I'm on ROS Kinetic, Ubuntu 16.04. Some general guidelines and things I could try from your perspective would be greatly appreciated. If you are interested in trying out my dataset, I can send a rosbag later.

Thank you in advance!

EDIT: posting config in-line:

    frequency: 10

    sensor_timeout: 0.25 #NOTE [D]: UWB works at 4Hz.

    two_d_mode: false
    transform_time_offset: 0.0
    transform_timeout: 0.25
    print_diagnostics: true

    publish_tf: true
    publish_acceleration: false

    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    # UWB (x,y,z):
    odom0: uwb/multilateration_odom
    odom0_config: [true, true, true,    #x,y,z
                   false, false, false,
                   false, false, false,
                   false, false, false,
                   false, false, false]
    odom0_differential: false
    odom0_relative: false 
    odom0_queue_size: 2
    odom0_pose_rejection_threshold: 3.0
    odom0_twist_rejection_threshold: 1.0
    odom0_nodelay: false

    #ROBOT ODOMETRY
    odom1: robot/new_odom
    odom1_config: [false, false, false,
                   false, false, true,      # yaw
                   true, false, false,      # vx
                   false, false, true,      # v_yaw
                   false, false, false]
    odom1_differential: true
    odom1_relative: true
    odom1_queue_size: 10
    odom1_pose_rejection_threshold: 3.0
    odom1_twist_rejection_threshold: 1.0 
    odom1_nodelay: false

    imu0: yei_imu/data  #mavros/imu/data
    imu0_config: [false, false, false,       # x,y,z
                  true, true, true,          # r,p,w
                  false, false, false,       # vx,vy,vz
                  false, false, false,          # vr,vp,vw
                  true, true, true]       # ax,ay,az

    imu0_nodelay: false
    imu0_differential: false
    imu0_relative: true      # TRUE FOR SURE. DO NOT CHANGE THIS.
    imu0_queue_size: 50

    imu0_remove_gravitational_acceleration: true

    imu0_pose_rejection_threshold: 0.8                 # Note the difference in parameter names
    imu0_twist_rejection_threshold: 0.8                #
    imu0_linear_acceleration_rejection_threshold: 0.2

EDIT (2): Following Tom's reply, I am now fusing vyaw and vy from the robot odometry and have got rid of the advanced parameters. I was not able to improve much on the initial results presented above. However, one thing really puzzled me:
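For reference, the odom1 block now looks roughly like this (a sketch, assuming vx is still fused as before and the yaw pose is no longer fused directly):

    odom1: robot/new_odom
    odom1_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,     # vx, vy
                   false, false, true,      # vyaw
                   false, false, false]
    # differential/relative only apply to pose data, so they no longer matter here
    odom1_queue_size: 10
    odom1_nodelay: false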

When I test with mavros/imu/data instead of yei_imu/data, the orientation of the robot is almost perfect and I get really good pose estimation results (I can post a video later so you can compare it with the original one).

So I am now wondering what can be wrong with the yei_imu...

Firstly, the frame_id of both of my IMUs is "base_link". Is it necessary to provide a static TF for each of them?

Using the IMU plugin for RViz, I also notice that both IMUs seem to report orientation well in terms of Euler angles. However, they report different absolute orientations, i.e. there is an offset between them.

This can be seen in RViz below:

[image: the robot's frames (red pointing forward, and actually facing north) + mavros/imu/data]

[image: the robot's frames (red pointing forward, and actually facing north) + yei_imu/data]

[image: the robot's frames (red pointing forward, and actually facing north) + both IMUs; they are clearly not aligned]

Shouldn't they be aligned? Should I provide a static TF for the yei_imu (rotating it by the right amount) so that they are aligned? Should they be aligned with the front of the robot at the start? Does this have to do with a wrong convention (i.e. NED vs. ENU representation)? Or something else? I'm really confused at this point about what I should do to fix the issue, as I'm not really an IMU expert.
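To make the static-TF option concrete, this is the kind of thing I have in mind; a minimal sketch only, where the yei_imu_link frame name and the 90-degree yaw value are placeholders (the IMU driver would also need to stamp its messages with that frame_id):

    #!/usr/bin/env python
    # Sketch: publish a static transform from base_link to a dedicated
    # yei_imu_link frame so that robot_localization can rotate the IMU
    # data into base_link. Frame name and yaw offset are placeholders.
    import rospy
    import tf2_ros
    from geometry_msgs.msg import TransformStamped
    from tf.transformations import quaternion_from_euler

    if __name__ == '__main__':
        rospy.init_node('yei_imu_static_tf')

        t = TransformStamped()
        t.header.stamp = rospy.Time.now()
        t.header.frame_id = 'base_link'
        t.child_frame_id = 'yei_imu_link'

        # No translation; only a yaw rotation (placeholder value, the real
        # mounting/heading offset would have to be measured).
        t.transform.translation.x = 0.0
        t.transform.translation.y = 0.0
        t.transform.translation.z = 0.0
        q = quaternion_from_euler(0.0, 0.0, 1.5708)
        t.transform.rotation.x = q[0]
        t.transform.rotation.y = q[1]
        t.transform.rotation.z = q[2]
        t.transform.rotation.w = q[3]

        broadcaster = tf2_ros.StaticTransformBroadcaster()
        broadcaster.sendTransform(t)
        rospy.spin()

Would something along these lines be the right approach?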

Below, I also include one message from each IMU, captured at the same instant:

yei_imu/data:

    ---
    header: 
      seq: 22813
      stamp: 
        secs: 1527267897
        nsecs: 153892576
      frame_id: base_link
    orientation: 
      x: 0.0140349511057
      y: 0.00752125261351
      z: -0.950622081757
      w: 0.309942185879
    orientation_covariance: [0.000304607, 0.0, 0.0, 0.0, 0.000304607, 0.0, 0.0, 0.0, 0.000304607]
    angular_velocity: 
      x: -0.00148047506809
      y: 0.00922212563455
      z: -0.00286923069507
    angular_velocity_covariance: [0.001164, 0.0, 0.0, 0.0, 0.001164, 0.0, 0.0, 0.0, 0.001164]
    linear_acceleration: 
      x: -0.0335188232422
      y: -0.031124621582
      z: 10.0125513428
    linear_acceleration_covariance: [2.401e-05, 0.0, 0.0, 0.0, 2.401e-05, 0.0, 0.0, 0.0, 2.401e-05]

mavros/imu/data:

    ---
    header: 
      seq: 11121
      stamp: 
        secs: 1527267897
        nsecs: 151121943
      frame_id: base_link
    orientation: 
      x: -0.0257356560841
      y: 0.0238066779099
      z: -0.964952790673
      w: -0.260071201531
    orientation_covariance: [0.001218428836, 0.0, 0.0, 0.0, 0.001218428836, 0.0, 0.0, 0.0, 0.001218428836]
    angular_velocity: 
      x: -0.00344950705767
      y: -0.00143239484169
      z: 0.00183732784353
    angular_velocity_covariance: [1.2184696791468346e-07, 0.0, 0.0, 0.0, 1.2184696791468346e-07, 0.0, 0.0, 0.0, 1.2184696791468346e-07]
    linear_acceleration: 
      x: 0.06864655
      y: 1.2958447453e-15
      z: 10.58137535
    linear_acceleration_covariance: [8.999999999999999e-08, 0.0, 0.0, 0.0, 8.999999999999999e-08, 0.0, 0.0, 0.0, 8.999999999999999e-08]
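For completeness, the yaw offset between the two orientations quoted above can be checked with a few lines of Python (quaternion values copied from the messages; this is just a quick offline calculation, not part of my setup):

    # Quick offline check of the yaw difference between the two IMU
    # orientations quoted above (quaternions are [x, y, z, w]).
    import math
    from tf.transformations import euler_from_quaternion

    yei    = [0.0140349511057, 0.00752125261351, -0.950622081757, 0.309942185879]
    mavros = [-0.0257356560841, 0.0238066779099, -0.964952790673, -0.260071201531]

    yaw_yei    = euler_from_quaternion(yei)[2]     # (roll, pitch, yaw) -> yaw
    yaw_mavros = euler_from_quaternion(mavros)[2]

    # Wrap the difference to [-pi, pi] before printing.
    diff = yaw_yei - yaw_mavros
    diff = math.atan2(math.sin(diff), math.cos(diff))

    print('yei yaw:    %.1f deg' % math.degrees(yaw_yei))
    print('mavros yaw: %.1f deg' % math.degrees(yaw_mavros))
    print('offset:     %.1f deg' % math.degrees(diff))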