odometry/filtered from robot_localization not matching orientation input from imu

asked 2020-11-25 21:21:32 -0500 by mugetsu

updated 2020-12-10 16:32:08 -0500

I'm doing 3D state estimation with r_l and having some trouble. I'm fusing three sensors:

  1. 2D wheel-encoder-based wheel odometry
  2. ZED camera odometry, which publishes both xyz and rpy; the rpy seems a little unreliable, so I'm only using xyz
  3. rpy from a relative IMU

Looking at my fused data, the rpy appears to be all over the place and doesn't seem to use anything from the IMU. In fact, I get the same results if I simply disable imu0 in the r_l yaml file. What's going on with my data? Is the IMU's covariance so big that the EKF is simply ignoring it?
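For intuition on that last question, here is a minimal 1-D Kalman-update sketch (not r_l itself; the numbers are made up) showing how an inflated measurement covariance drives the filter gain toward zero:

```python
# Minimal 1-D Kalman update: gain K = P / (P + R), where P is the state
# variance and R the measurement variance. A huge R pushes K toward zero,
# so the correction step barely moves the state -- the sensor is
# effectively ignored.
def kalman_gain(P, R):
    return P / (P + R)

k_trusted = kalman_gain(0.1, 0.01)   # small R: gain ~0.91, measurement used
k_ignored = kalman_gain(0.1, 174.0)  # huge R: gain ~0.0006, measurement ignored
```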

here is my yaml file:

initial_state:  [0.0,  0.0,  0.59,
                 0.0,  0.0,  0,
                 0.0,  0.0,  0.0,
                 0.0,  0.0,  0.0,
                 0.0,  0.0,  0.0]
frequency: 10
two_d_mode: false

transform_time_offset: 0.2
transform_timeout: 0.0

print_diagnostics: false

debug: false

debug_out_file: /tmp/debug.txt

publish_tf: true

publish_acceleration: false
predict_to_current_time: true
map_frame: map              # Defaults to "map" if unspecified
odom_frame: odom            # Defaults to "odom" if unspecified
base_link_frame: base_link  # Defaults to "base_link" if unspecified
world_frame: odom          # Defaults to the value of odom_frame if unspecified

odom0: wheel_odom

#only use velocities from wheel_odom; absolute values are erroneous due to slip. only vx, vy and vyaw
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,
               false, false, false,
               false, false, false]

odom0_queue_size: 5

odom0_differential: false

odom0_relative: false
use_control: false

#bottom facing VO camera

#using everything from vo for now
odom1: /camera_manager1/zed_node/odom
odom1_config: [true,  true,  true,
               false, false, false,
               false, false, false,
               false, false, false,
               false, false, false]
odom1_differential: false   
odom1_relative: false
#

imu0: /vectornav/IMU

#The IMU gives orientation, linear acceleration, and angular velocity
#use imu for velocity for vx,vy,vz, vrpy
imu0_config: [false, false, false,
              true,  true,  true,
              false, false, false,
              false, false, false,
              false, false, false]

imu0_differential: false
imu0_relative: true

imu0_remove_gravitational_acceleration: true

bag is here: https://drive.google.com/file/d/10l0J...

wheel_odom:

header: 
  seq: 286175
  stamp: 
    secs: 1605734718
    nsecs: 849224511
  frame_id: "odom"
child_frame_id: "base_link"
pose: 
  pose: 
    position: 
      x: 1.26266401924
      y: -4.65020220554
      z: 0.572850814038
    orientation: 
      x: 0.0
      y: 0.0
      z: -0.914122786186
      w: 0.405437457292
  covariance: [0.025, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.025, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.14, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.14, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.11]
twist: 
  twist: 
    linear: 
      x: -0.0172806155128
      y: -0.0188698232587
      z: 0.0
    angular: 
      x: 0.0
      y: 0.0
      z: -0.0459902520252
  covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.025, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0 ...
(message truncated)
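For reference, the pose covariance above is a flattened row-major 6x6 matrix over (x, y, z, roll, pitch, yaw), so the variances sit on the diagonal at indices 0, 7, 14, 21, 28, 35. A quick sketch pulling them out of the wheel_odom sample:

```python
# Flattened 6x6 row-major pose covariance from the wheel_odom sample above
cov = [0.025, 0.0, 0.0, 0.0, 0.0, 0.0,
       0.0, 0.025, 0.0, 0.0, 0.0, 0.0,
       0.0, 0.0, 1.0, 0.0, 0.0, 0.0,
       0.0, 0.0, 0.0, 3.14, 0.0, 0.0,
       0.0, 0.0, 0.0, 0.0, 3.14, 0.0,
       0.0, 0.0, 0.0, 0.0, 0.0, 0.11]

# Variances for [x, y, z, roll, pitch, yaw] lie on the diagonal
variances = [cov[i * 6 + i] for i in range(6)]
# -> [0.025, 0.025, 1.0, 3.14, 3.14, 0.11]
```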

Comments

Hi there, please include a sample message from every sensor input.

Tom Moore  ( 2020-12-09 04:43:01 -0500 )

1 Answer


answered 2020-12-09 04:49:30 -0500 by Tom Moore

updated 2020-12-11 02:12:05 -0500

Without more information (see my comment about including sensor messages), my guess is that:

  1. You aren't measuring any of the 3D orientation variables from the non-IMU inputs (this makes sense, given what you said).
  2. Your IMU data isn't being respected. This is often because you don't have a transform defined from the IMU's frame to your robot's body frame.

The real issue is (2). If you want the filter to estimate your state, you have to provide a reference for every variable in the state, whether it's for the absolute dimension (e.g., RPY) or the velocities for those dimensions. If including the IMU data does nothing, then the filter is ignoring it.

BTW, your IMU config comment doesn't match your config. The comment says you're fusing velocities, but you are actually fusing absolute orientation.

EDIT after updates:

Do you have a transform defined from base_link->imu_link anywhere? If not, you need one.
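If one is missing, it can be published from a launch file. A minimal sketch (the frame name imu_link and the zero offset are assumptions; substitute your IMU's actual frame_id and mounting pose):

```xml
<!-- Hypothetical static transform; args are: x y z yaw pitch roll parent child -->
<node pkg="tf2_ros" type="static_transform_publisher" name="base_to_imu_tf"
      args="0 0 0 0 0 0 base_link imu_link" />
```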

Regardless, yes, those covariances are utterly massive. The standard deviation for each angular dimension is ~13.2 radians. I'm not sure how you or your IMU driver are computing them, but the filter would definitely behave strangely (and largely ignore measurements) with covariances like that.
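A std dev of ~13.2 rad implies a diagonal covariance entry of roughly 174 rad²; a quick sketch (the 174 value is assumed for illustration, not taken from the bag):

```python
import math

# Hypothetical orientation covariance diagonal of ~174 rad^2 per axis;
# the standard deviation is the square root of each variance.
cov_diag = [174.0, 174.0, 174.0]
std_devs = [math.sqrt(v) for v in cov_diag]  # each ~13.2 rad
```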


Comments

hi, thanks for taking a look. I've updated my post with the sensor input samples. It appears that the issue may be my IMU's covariance? Such a large covariance means the IMU isn't trusted much at all.

The IMU comment needs to be updated; I'm fusing the orientation as a relative value.

mugetsu  ( 2020-12-10 16:34:52 -0500 )
