Advice on improving pose estimation with robot_localization
Dear Tom Moore,
Let me start by thanking you for your excellent work on the robot_localization ROS package. I have been playing with it recently to estimate the pose of a differential-drive outdoor robot equipped with several sensors, and I would like to kindly ask for your opinion on it.
Perhaps you could send me some tips on how to improve the pose estimation of the robot, especially the robot's orientation. Here is a video of the pose estimation that I get.
The odometry estimate given by the EKF node is the dark orange/red one. Below is a list of the main topics available in my dataset:
- /laser/scan_filtered --> Laserscan data
- /mavros/global_position/local --> Odometry msg fusing GPS+IMU (from mavros: Brown in the video)
- /mavros/global_position/raw/fix --> GPS data
- /mavros/imu/data --> Mavros IMU data
- /robot/odom --> Encoder data (odometry: Green in the video)
- /robot/new_odom --> Encoder data (odometry with covariance -- added offline)
- /tf --> Transforms
- /uwb/multilateration_odom --> Multilateration (triangulation method providing global x,y,z)
- /yei_imu/data --> Our own IMU data
- /zed/odom_with_twist --> Visual odometry from the ZED Stereolabs outdoor camera (Blue in the video)
Although I have plenty of data, in a first stage I am trying to fuse the estimate given by the onboard Ultra-Wideband (UWB) multilateration software (positional data only, no orientation given) + the robot encoders, which are decent, + our IMU (on yei_imu/data).
However, as you can see, the estimated orientation of the robot is sometimes odd. I would expect the blue axis of the base_link frame (in the video) to always point up, and the red axis to always point forward. However, it is clear that the red axis sometimes points outwards, instead of pointing in the direction of movement, which is clearly visible in the video.
Do you have any suggestions to improve the orientation estimation of my robot?
Also, I notice that for positional tracking, it doesn't seem to make much of a difference whether I use just the UWB estimation or fuse UWB + robot encoders. I was expecting the fusion to smooth out the trajectory a bit, as the UWB data is subject to occasional jumps in position.
These are the params that I am currently using in the robot_localization software, in case you want to advise me to change anything.
Btw, I'm on ROS Kinetic, Ubuntu 16.04. Just some general guidelines and things that I could try from your perspective would be greatly appreciated. If you are interested in trying out my dataset, I can send a rosbag later.
Thank you in advance!
EDIT: posting config in-line:
frequency: 10
sensor_timeout: 0.25 #NOTE [D]: UWB works at 4Hz.
two_d_mode: false
transform_time_offset: 0.0
transform_timeout: 0.25
print_diagnostics: true
publish_tf: true
publish_acceleration: false
map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: odom
# UWB (x,y,z):
odom0: uwb/multilateration_odom
odom0_config: [true, true, true, #x,y,z
false, false, false,
false, false, false,
false, false, false,
false, false, false]
odom0_differential: false
odom0_relative: false
odom0_queue_size: 2
odom0_pose_rejection_threshold: 3.0
odom0_twist_rejection_threshold: 1.0
odom0_nodelay: false
#ROBOT ODOMETRY
odom1: robot/new_odom
odom1_config: [false, false, false,
false, false, true, # yaw
true, false, false, # vx
false, false, true, # v_yaw
false, false, false]
odom1_differential: true
odom1_relative: true
odom1_queue_size: 10
odom1_pose_rejection_threshold: 3.0
odom1_twist_rejection_threshold: 1.0
odom1_nodelay: false
imu0: yei_imu/data #mavros/imu/data
imu0_config: [false, false, false, # x,y,z
true, true, true, # r,p,w
false, false, false, # vx,vy,vz
false, false, false, # vr,vp,vw
true, true, true] # ax,ay,az
imu0_nodelay: false
imu0_differential: false
imu0_relative: true # TRUE FOR SURE. DO NOT CHANGE THIS.
imu0_queue_size: 50
imu0_remove_gravitational_acceleration: true
imu0_pose_rejection_threshold: 0.8 # Note the difference in parameter names
imu0_twist_rejection_threshold: 0.8 #
imu0_linear_acceleration_rejection_threshold: 0.2
EDIT (2): Following Tom's reply, I am now fusing vyaw and vy from the robot odometry and got rid of the advanced parameters. I was not able to improve much on the initial results presented above. However, one thing really puzzled me:
When I test with "mavros/imu/data" instead of the "yei_imu/data", the orientation of the robot is almost perfect and I get really good pose estimation results (I can post a video later so you can compare with the original one).
So I am now wondering what can be wrong with the yei_imu...
Firstly, the frame_ids of both of my IMUs are "base_link". Is it necessary to provide a static TF for each of them?
Using the IMU plugin for RViz, I also notice that both IMUs seem to report the orientation correctly in terms of Euler angles. However, they provide different absolute orientations, i.e. there is an offset between them.
This can be seen in the RViz screenshots below:
- Screenshot 1: the frames of the robot (red pointing forward, and actually facing north) + mavros/imu/data.
- Screenshot 2: the frames of the robot (red pointing forward, and actually facing north) + yei_imu/data.
- Screenshot 3: the frames of the robot (red pointing forward, and actually facing north) + both IMUs (they are clearly not aligned).
Shouldn't they be aligned? Should I provide a static TF for yei_imu (rotating by the right amount) so that they are aligned? Should they be aligned with the front of the robot in the beginning? Does this have to do with a wrong convention (i.e. NED vs. ENU representation), or something else? I'm really confused at this point about what I should do to fix the issue, as I'm not really an IMU expert.
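In case a static TF is indeed the way to go, here is a minimal Python sketch of what I understand I would need to run (the child frame name yei_imu_link and the yaw offset are placeholders I made up and would still have to measure):

#!/usr/bin/env python
# Minimal sketch: publish a static transform from base_link to a dedicated
# frame for the yei IMU. "yei_imu_link" and the yaw offset are placeholders.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped
from tf.transformations import quaternion_from_euler

if __name__ == '__main__':
    rospy.init_node('yei_imu_static_tf')

    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = 'base_link'     # parent frame
    t.child_frame_id = 'yei_imu_link'   # placeholder frame for the yei IMU

    # IMU mounted at the vehicle center: no translation (placeholder values).
    t.transform.translation.x = 0.0
    t.transform.translation.y = 0.0
    t.transform.translation.z = 0.0

    # Placeholder yaw offset (radians) between the yei IMU and base_link.
    qx, qy, qz, qw = quaternion_from_euler(0.0, 0.0, 1.57)
    t.transform.rotation.x = qx
    t.transform.rotation.y = qy
    t.transform.rotation.z = qz
    t.transform.rotation.w = qw

    broadcaster = tf2_ros.StaticTransformBroadcaster()
    broadcaster.sendTransform(t)
    rospy.spin()

If I understand correctly, the yei IMU messages would also have to be published (or republished) with frame_id yei_imu_link for this transform to actually be used.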
Below, I also include a message from each of the two IMUs at (roughly) the same instant:
yei_imu/data:
---
header:
  seq: 22813
  stamp:
    secs: 1527267897
    nsecs: 153892576
  frame_id: base_link
orientation:
  x: 0.0140349511057
  y: 0.00752125261351
  z: -0.950622081757
  w: 0.309942185879
orientation_covariance: [0.000304607, 0.0, 0.0, 0.0, 0.000304607, 0.0, 0.0, 0.0, 0.000304607]
angular_velocity:
  x: -0.00148047506809
  y: 0.00922212563455
  z: -0.00286923069507
angular_velocity_covariance: [0.001164, 0.0, 0.0, 0.0, 0.001164, 0.0, 0.0, 0.0, 0.001164]
linear_acceleration:
  x: -0.0335188232422
  y: -0.031124621582
  z: 10.0125513428
linear_acceleration_covariance: [2.401e-05, 0.0, 0.0, 0.0, 2.401e-05, 0.0, 0.0, 0.0, 2.401e-05]
mavros/imu/data:
---
header:
  seq: 11121
  stamp:
    secs: 1527267897
    nsecs: 151121943
  frame_id: base_link
orientation:
  x: -0.0257356560841
  y: 0.0238066779099
  z: -0.964952790673
  w: -0.260071201531
orientation_covariance: [0.001218428836, 0.0, 0.0, 0.0, 0.001218428836, 0.0, 0.0, 0.0, 0.001218428836]
angular_velocity:
  x: -0.00344950705767
  y: -0.00143239484169
  z: 0.00183732784353
angular_velocity_covariance: [1.2184696791468346e-07, 0.0, 0.0, 0.0, 1.2184696791468346e-07, 0.0, 0.0, 0.0, 1.2184696791468346e-07]
linear_acceleration:
  x: 0.06864655
  y: 1.2958447453e-15
  z: 10.58137535
linear_acceleration_covariance: [8.999999999999999e-08, 0.0, 0.0, 0.0, 8.999999999999999e-08, 0.0, 0.0, 0.0, 8.999999999999999e-08]
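To put a number on the offset between the two orientations above, I suppose one could convert both quaternions to yaw and compare them; a minimal Python sketch (quaternion values copied from the two messages above):

# Quick check of the yaw offset between the two IMUs, using the orientation
# quaternions from the messages above.
import math
from tf.transformations import euler_from_quaternion

# Quaternions are (x, y, z, w), copied from the message dumps above.
yei_q    = (0.0140349511057, 0.00752125261351, -0.950622081757, 0.309942185879)
mavros_q = (-0.0257356560841, 0.0238066779099, -0.964952790673, -0.260071201531)

yei_yaw = euler_from_quaternion(yei_q)[2]
mavros_yaw = euler_from_quaternion(mavros_q)[2]

# Wrap the difference to [-pi, pi].
offset = math.atan2(math.sin(yei_yaw - mavros_yaw), math.cos(yei_yaw - mavros_yaw))
print('yei yaw:    %.3f rad' % yei_yaw)
print('mavros yaw: %.3f rad' % mavros_yaw)
print('offset:     %.3f rad (%.1f deg)' % (offset, math.degrees(offset)))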
Asked by DavidPortugal on 2018-05-21 11:20:28 UTC
Answers
There's quite a bit going on here, and I'm afraid that without sample input messages from every sensor input, it may be hard to help.
However, I can comment on a few things.
- You are fusing yaw from two sources, and I'd be willing to bet they don't agree, at least not in the long term. Your odometry yaw is going to drift, and your IMU likely won't, so relative mode won't help. I'd pick one source for each absolute pose variable (unless you know they will always be reported in the same frame), and fuse the other sources as velocities, or turn on differential mode for them.
- Get rid of any advanced parameters, namely, the rejection thresholds. My advice is to always start with one sensor, make sure it's behaving as expected, then add other sensors, until it's working as expected. Once all of that works, then you can introduce the advanced params, if needed.
- If your robot can't move laterally, fuse the 0 value you're getting from your robot's wheel encoders for Y velocity.
- Make sure the frames of your sensor data match, or that a transform is being provided from each sensor to get the data into your world_frame. For example, let's say you drive straight forward for 10 meters, and your IMU says you are going at a heading of pi/3. Your UWB position, though, says you went from (0, 0) to (10, 0), which would imply a heading of 0. That means your sensors' coordinate frames don't agree, and you need to provide a transform between them (though that will assume that your UWB setup is fixed). A rough way to check for this kind of mismatch is sketched right after this list.
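A concrete illustration of that check (a rough sketch with placeholder values matching the example above, assuming the robot is driven in a straight line):

# Rough check of whether the IMU and UWB frames agree: while driving
# straight, the heading implied by the UWB displacement should roughly
# match the yaw reported by the IMU.
import math
from tf.transformations import euler_from_quaternion

# Two UWB positions from the start and end of a straight run (placeholders).
x0, y0 = 0.0, 0.0
x1, y1 = 10.0, 0.0

# IMU orientation during the run as a quaternion (x, y, z, w); this
# placeholder corresponds to a yaw of roughly pi/3.
imu_q = (0.0, 0.0, 0.5, 0.866)

uwb_heading = math.atan2(y1 - y0, x1 - x0)
imu_yaw = euler_from_quaternion(imu_q)[2]

# Wrap the difference to [-pi, pi]; a large value suggests the frames disagree.
diff = math.atan2(math.sin(imu_yaw - uwb_heading), math.cos(imu_yaw - uwb_heading))
print('UWB heading: %.2f rad, IMU yaw: %.2f rad, difference: %.2f rad'
      % (uwb_heading, imu_yaw, diff))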
EDIT 1 in response to comments and updates:
- If you've updated your parameters, please update the parameters in the question as well. I still see the advanced parameters in there.
- Re: the comment below, if the UWB is producing a pose estimate in some world frame, then you don't want a base_link -> uwb_frame transform. You want a transform from odom -> uwb_frame. With the notable exception of IMUs, all pose data should typically be in the world_frame or have a static transform from the sensor frame directly to the world_frame. Likewise, velocities should always be reported in the base_link_frame, or there should be a static transform directly from the sensor frame to the base_link_frame.
- If your IMUs are mounted at your vehicle's center and are mounted in their neutral orientation, then their frame_id being base_link makes sense. If they are offset from the vehicle's center, then you need static transforms to account for that.
- Re: the IMUs, it's getting a bit out of scope for this question (you should probably post another one, or raise the issue with the package authors). However, if you have two IMUs and they both report absolute orientation, then yes, they ought to be within some reasonable distance of one another, assuming they are both mounted in the same orientation.
- You are fusing absolute yaw (with differential mode turned on) and yaw velocity from your wheel odometry. That's effectively counting the same data source twice. Get rid of absolute yaw, and then you can disable differential mode for that input.
- I still recommend starting with just wheel encoders. Get that looking like you want it, then add a single IMU. Once that's behaving, you can either add the other IMU, or switch to the other IMU. In any case, get that working correctly, then add the UWB. There are too many variables at play to solve everything at once.
- Finally, please post sample messages for all sensor inputs: wheel encoders, IMUs, and UWB data.
Asked by Tom Moore on 2018-05-24 06:49:59 UTC
Comments
Thanks! I did as you suggested in the first 3 points. Regarding the 4th point: I have base_link -> uwb_frame (the child_frame_id of the msgs coming from /uwb/multilateration_odom). Yet, both IMUs have frame_id = base_link. Should I provide static TFs for each of them? (Please see my update to the main question as well.)
Asked by DavidPortugal on 2018-05-25 12:12:34 UTC
Could you please give me some details about how you calculated the covariance matrices for the imu?
Asked by Chubba_07 on 2019-11-09 06:52:05 UTC
Comments
If you could post a bag file, that would be useful.
Asked by stevejp on 2018-05-22 14:39:38 UTC