base_link oscillating between encoder odom and filtered odom from the robot_localization package
Hello,
I built a differential drive mobile robot that uses move_base for navigation. It calculates odometry from the wheel encoder ticks, for which I used this package: link text
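For context, that wheel odometry is presumably the standard differential-drive dead-reckoning update from encoder tick deltas, roughly as sketched below (the wheel radius, track width, and ticks per revolution here are placeholders, not the robot's actual parameters):

import math

# placeholder parameters; the real values live in the odometry package's config
WHEEL_RADIUS = 0.05      # m
WHEEL_SEPARATION = 0.30  # m
TICKS_PER_REV = 1024

def update_odometry(x, y, theta, d_ticks_left, d_ticks_right):
    # distance travelled by each wheel since the last update
    d_left = 2.0 * math.pi * WHEEL_RADIUS * d_ticks_left / TICKS_PER_REV
    d_right = 2.0 * math.pi * WHEEL_RADIUS * d_ticks_right / TICKS_PER_REV
    d_center = 0.5 * (d_left + d_right)
    d_theta = (d_right - d_left) / WHEEL_SEPARATION
    # integrate the pose using the midpoint heading
    x += d_center * math.cos(theta + 0.5 * d_theta)
    y += d_center * math.sin(theta + 0.5 * d_theta)
    return x, y, theta + d_theta

Any wheel slip feeds straight into this integration, because the tick counts no longer correspond to the actual motion, which is exactly the error the IMU fusion below is meant to compensate for.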
The robot had localization problems due to wheel slippage, so to improve localization I fused the wheel encoder odometry with IMU data from a RealSense L515 using the robot_localization package. After fusing, the filtered odom is not accurate and the tf of base_link keeps oscillating between the encoder /diff/odom and the /odom from robot_localization. Also, when I spin the wheels while holding the robot stationary, both odoms rotate.
After going through the robot_localization documentation completely, I tried every example on the configuring robot_localization docs page (link text), with no improvement. Below is my EKF YAML file:
frequency: 30
two_d_mode: true
publish_tf: true
map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: odom
# odom from the wheel encoders
odom0: /diff/odom
odom0_config: [false, false, false,   # x, y, z
               false, false, true,    # roll, pitch, yaw
               true,  true,  false,   # vx, vy, vz
               false, false, false,   # vroll, vpitch, vyaw
               false, false, false]   # ax, ay, az
odom0_differential: true
#odom0_relative: true
imu0_remove_gravitational_acceleration: true
## IMU 0 data from the intel L515
imu0: /imu/dataa
imu0_config: [false, false, false,   # x, y, z
              false, false, true,    # roll, pitch, yaw
              false, false, false,   # vx, vy, vz
              false, false, false,   # vroll, vpitch, vyaw
              true,  false, false]   # ax, ay, az
#imu_linear_acceleration_rejection_threshold: 0.5
#imu_angular_velocity_rejection_threshold: 0.5
#imu0_relative: false
imu0_differential: false
Below is a sample odom message (/diff/odom) generated from the wheel encoders. It works pretty well: the output stays static when the robot is not moving, and I verified it with a box test.
header:
seq: 1991
stamp:
secs: 1683974534
nsecs: 421610116
frame_id: "odom"
child_frame_id: "base_link"
pose:
pose:
position:
x: -0.045909170121061965
y: 0.0009563543501025512
z: 0.0
orientation:
x: 0.0
y: 0.0
z: -0.009476805004755682
w: 0.9999550940751799
covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
twist:
twist:
linear:
x: 0.0
y: 0.0
z: 0.0
angular:
x: 0.0
y: 0.0
z: 0.0
covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
For the IMU data I am using the L515 RealSense camera's IMU. I ran the following command:
roslaunch realsense2_camera rs_camera.launch enable_gyro:=true enable_accel:=true
which publishes the acceleration and gyro data on two separate topics. Below is the acceleration data:
header:
seq: 38099
stamp:
secs: 1683970284
nsecs: 140621662
frame_id: "camera_accel_optical_frame"
orientation:
x: 0.0
y: 0.0
z: 0.0
w: 0.0
orientation_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity:
x: 0.0
y: 0.0
z: 0.0
angular_velocity_covariance: [0.01, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.01]
linear_acceleration:
x: -0.009806649759411812
y: -9.688969612121582
z: 0.4903324842453003
linear_acceleration_covariance: [0.01, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.01]
Below is the gyro data:
header:
seq: 67908
stamp:
secs: 1683970434
nsecs: 514808655
frame_id: "camera_gyro_optical_frame"
orientation:
x: 0.0
y: 0.0
z: 0.0
w: 0.0
orientation_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity:
x: -0.003490658476948738
y: -0.001745329238474369
z: -0.003490658476948738
angular_velocity_covariance: [0.01, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.01]
linear_acceleration:
x: 0.0
y: 0.0
z: 0.0
linear_acceleration_covariance: [0.01, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.01]
Later I wrote a Python script to merge the accel and gyro data into a single topic called /imu/data, published with the frame_id "camera_link". The camera_link frame has the same (NWU) orientation as base_link, and I added the lines of code below to change the signs as mentioned in the robot_localization documentation.
def gyro_callback(self, msg):
    # self.gyro_msg = msg.angular_velocity
    # keep x, flip the sign of y and z of the raw gyro reading
    self.gyro_msg_x = msg.angular_velocity.x
    self.gyro_msg_y = -msg.angular_velocity.y
    self.gyro_msg_z = -msg.angular_velocity.z

def accel_callback(self, msg):
    # self.accel_msg = msg.linear_acceleration
    # keep x, flip the sign of y and z of the raw accel reading
    self.accel_msg_x = msg.linear_acceleration.x
    self.accel_msg_y = -msg.linear_acceleration.y
    self.accel_msg_z = -msg.linear_acceleration.z

def combine(self):
    self.Imu_msg.header.stamp = rospy.Time.now()
    # self.Imu_msg.linear_acceleration = self.accel_msg
    # self.Imu_msg.angular_velocity = self.gyro_msg
    # remap axes when filling the combined message: x <- z, y <- x, z <- y
    self.Imu_msg.angular_velocity.x = self.gyro_msg_z
    self.Imu_msg.angular_velocity.y = self.gyro_msg_x
    self.Imu_msg.angular_velocity.z = self.gyro_msg_y
    self.Imu_msg.linear_acceleration.x = self.accel_msg_z
    self.Imu_msg.linear_acceleration.y = self.accel_msg_x
    self.Imu_msg.linear_acceleration.z = self.accel_msg_y
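Since only the callbacks are shown above, here is, for reference, a minimal sketch of a complete node around them. The raw RealSense topic names (/camera/gyro/sample and /camera/accel/sample) and the 100 Hz publish rate are assumptions based on the rs_camera.launch defaults, not something stated in the post:

#!/usr/bin/env python
import rospy
from sensor_msgs.msg import Imu

class ImuCombiner(object):
    def __init__(self):
        self.Imu_msg = Imu()
        self.Imu_msg.header.frame_id = "camera_link"
        self.gyro_msg_x = self.gyro_msg_y = self.gyro_msg_z = 0.0
        self.accel_msg_x = self.accel_msg_y = self.accel_msg_z = 0.0
        self.pub = rospy.Publisher("/imu/data", Imu, queue_size=10)
        # assumed realsense2_camera default topic names; adjust if remapped
        rospy.Subscriber("/camera/gyro/sample", Imu, self.gyro_callback)
        rospy.Subscriber("/camera/accel/sample", Imu, self.accel_callback)

    def gyro_callback(self, msg):
        # sign changes as in the callback posted above
        self.gyro_msg_x = msg.angular_velocity.x
        self.gyro_msg_y = -msg.angular_velocity.y
        self.gyro_msg_z = -msg.angular_velocity.z

    def accel_callback(self, msg):
        self.accel_msg_x = msg.linear_acceleration.x
        self.accel_msg_y = -msg.linear_acceleration.y
        self.accel_msg_z = -msg.linear_acceleration.z

    def combine(self):
        # axis remapping as in the combine() posted above
        self.Imu_msg.header.stamp = rospy.Time.now()
        self.Imu_msg.angular_velocity.x = self.gyro_msg_z
        self.Imu_msg.angular_velocity.y = self.gyro_msg_x
        self.Imu_msg.angular_velocity.z = self.gyro_msg_y
        self.Imu_msg.linear_acceleration.x = self.accel_msg_z
        self.Imu_msg.linear_acceleration.y = self.accel_msg_x
        self.Imu_msg.linear_acceleration.z = self.accel_msg_y

    def run(self):
        rate = rospy.Rate(100)  # assumed publish rate
        while not rospy.is_shutdown():
            self.combine()
            self.pub.publish(self.Imu_msg)
            rate.sleep()

if __name__ == "__main__":
    rospy.init_node("imu_combiner")
    ImuCombiner().run()

A sample of the resulting merged /imu/data message looks like this: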
header:
seq: 34470
stamp:
secs: 1683974671
nsecs: 44133186
frame_id: "camera_link"
orientation:
x: 0.0
y: 0.0
z: 0.0
w: 0.0
orientation_covariance: [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity:
x: 0.003490658476948738
y: -0.012217304669320583
z: -0.0
angular_velocity_covariance: [0.01, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.01]
linear_acceleration:
x: -0.4412992298603058
y: 0.029419949278235435
z: -9.904716491699219
linear_acceleration_covariance: [0.01, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.01]
Finally, I used imu_filter_madgwick to calculate the orientation and publish the result on "/imu/dataa" with the same reference frame, i.e. "camera_link":
header:
seq: 33317
stamp:
secs: 1683974665
nsecs: 454127788
frame_id: "camera_link"
orientation:
x: -0.975022266924628
y: -0.22210349572453683
z: -0.0011523665941609964
w: -0.000536879253128781
orientation_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity:
x: 0.005235987715423107
y: 0.001745329238474369
z: 0.001745329238474369
angular_velocity_covariance: [0.01, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.01]
linear_acceleration:
x: 0.17651969194412231
y: 0.029419949278235435
z: -9.747809410095215
linear_acceleration_covariance: [0.01, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.01]
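As a sanity check on the Madgwick output, the quaternion can be converted to roll/pitch/yaw; a minimal sketch, assuming the topic name above:

#!/usr/bin/env python
import rospy
from sensor_msgs.msg import Imu
from tf.transformations import euler_from_quaternion

def cb(msg):
    q = msg.orientation
    roll, pitch, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])
    rospy.loginfo("roll %.3f  pitch %.3f  yaw %.3f (rad)", roll, pitch, yaw)

rospy.init_node("imu_rpy_check")
rospy.Subscriber("/imu/dataa", Imu, cb)
rospy.spin()

For the sample message above (w close to 0, x close to -0.975), this works out to roughly a 180-degree rotation about the x axis, which is worth comparing with how the camera frame is actually mounted relative to base_link.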
Finally, after fusing the encoder odom "/diff/odom" and the IMU data "/imu/dataa", the filter produces an odom that is reasonably stable right after launching move_base and everything else for navigation (because the encoder odom and the filtered odom start out at the same place). But after driving the robot to a few goal locations, the two odoms end up separated by some distance, base_link keeps oscillating between both odoms, localization fails, and this warning appears in the terminal:
[ WARN] [1683980378.552122842]: Costmap2DROS transform timeout. Current time: 1683980378.5521, global_pose stamp: 1683980378.0468, tolerance: 0.5000
[ WARN] [1683980378.626111628]: Could not get robot pose, cancelling reconfiguration
[ WARN] [1683980379.562175122]: Costmap2DROS transform timeout. Current time: 1683980379.5621, global_pose stamp: 1683980378.9682, tolerance: 0.5000
[ WARN] [1683980379.626527157]: Could not get robot pose, cancelling reconfiguration
[ WARN] [1683980380.572120742]: Costmap2DROS transform timeout. Current time: 1683980380.5721, global_pose stamp: 1683980379.8367, tolerance: 0.5000
[ WARN] [1683980380.726186120]: Could not get robot pose, cancelling reconfiguration
[ WARN] [1683980381.572142327]: Costmap2DROS transform timeout. Current time: 1683980381.5721, global_pose stamp: 1683980380.9759, tolerance: 0.5000
[ WARN] [1683980381.826115613]: Could not get robot pose, cancelling reconfiguration
[ WARN] [1683980382.582096515]: Costmap2DROS transform timeout. Current time: 1683980382.5821, global_pose stamp: 1683980381.7881, tolerance: 0.5000
[ WARN] [1683980382.826116742]: Could not get robot pose, cancelling reconfiguration
[ WARN] [1683980383.582113600]: Costmap2DROS transform timeout. Current time: 1683980383.5821, global_pose stamp: 1683980382.7107, tolerance: 0.5000
[ WARN] [1683980383.826157371]: Could not get robot pose, cancelling reconfiguration
Below is the odom generated by the robot_localization EKF node:
header:
seq: 5947
stamp:
secs: 1683974533
nsecs: 614134550
frame_id: "odom"
child_frame_id: "base_link"
pose:
pose:
position:
x: -0.008143414470697219
y: 0.0026879490663362176
z: 0.0
orientation:
x: 0.0
y: 0.0
z: -0.11410798900099858
w: 0.9934683522116585
covariance: [9.916338966693454, 0.004858011791483132, 0.0, 0.0, 0.0, 0.07085640422939016, 0.0048580117914832684, 9.95254786216812, 0.0, 0.0, 0.0, 0.4679562661655844, 0.0, 0.0, 4.995827547588767e-07, -2.3643689494352754e-23, -1.2347460179269096e-17, 0.0, 0.0, 0.0, -2.3643689494352754e-23, 4.991668982909007e-07, -6.996822699637586e-31, 0.0, 0.0, 0.0, -1.23474601792691e-17, -6.996842691797004e-31, 4.991668982909007e-07, 0.0, 0.07085640422939002, 0.46795626616558383, 0.0, 0.0, 0.0, 11.894537364842147]
twist:
twist:
linear:
x: 0.002945414185196462
y: 0.0
z: 0.0
angular:
x: 0.0
y: 0.0
z: -0.005054415004390783
covariance: [0.0023165662163858276, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0024499726299258996, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4.993746534694231e-07, -1.4505032148156e-32, -7.537589127757287e-27, 0.0, 0.0, 0.0, -1.4505032148156001e-32, 4.975171875545832e-07, -1.6894730619944164e-39, 0.0, 0.0, 0.0, -7.537589127757286e-27, -1.6894710908138694e-39, 4.975171875545832e-07, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 9.999949847400107e-10]
NOTE:
1. In my robot the camera_link orientation is the same (NWU) as base_link, so the angular velocity is positive when the robot rotates anticlockwise. I tried with both positive 9.81 and negative 9.81 on the z axis.
2. For the IMU data, if I lift the robot from the left side, i.e. the positive y axis goes up 90 degrees, the y axis reads +9.81.
3. If the forward-facing x axis pitches downward 90 degrees, -9.81 is measured on the x axis (a quick script for double-checking these conventions is sketched after this list).
4. There is no problem with the tf tree; it is all good.
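For reference, a check along these lines can be used to confirm points 2 and 3 above (the topic name is assumed to be the merged /imu/data from earlier):

#!/usr/bin/env python
import rospy
from sensor_msgs.msg import Imu

def cb(msg):
    a = msg.linear_acceleration
    # per notes 2 and 3: lifting the +y side 90 degrees should read +9.81 on y,
    # and pitching the forward x axis down 90 degrees should read -9.81 on x
    rospy.loginfo("accel x: %.2f  y: %.2f  z: %.2f", a.x, a.y, a.z)

rospy.init_node("imu_axis_check")
rospy.Subscriber("/imu/data", Imu, cb)
rospy.spin()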
The only problem I am facing is that the base_link of the robot keeps oscillating between the encoder odom /diff/odom and the filtered odom generated by the robot_localization package, and the filtered odom performs the same as the encoder odom, i.e. it does not perform any better under slippery conditions.
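To make the drift between the two easier to see, a small comparison script along these lines can log both poses side by side (the EKF output topic is assumed to be the default /odometry/filtered):

#!/usr/bin/env python
import rospy
from nav_msgs.msg import Odometry

latest = {}

def make_cb(name):
    def cb(msg):
        latest[name] = msg.pose.pose.position
    return cb

def report(event):
    if "wheel" in latest and "ekf" in latest:
        w, e = latest["wheel"], latest["ekf"]
        rospy.loginfo("wheel (%.3f, %.3f)  ekf (%.3f, %.3f)  dx %.3f  dy %.3f",
                      w.x, w.y, e.x, e.y, e.x - w.x, e.y - w.y)

rospy.init_node("odom_compare")
rospy.Subscriber("/diff/odom", Odometry, make_cb("wheel"))
rospy.Subscriber("/odometry/filtered", Odometry, make_cb("ekf"))  # assumed default EKF output topic
rospy.Timer(rospy.Duration(1.0), report)
rospy.spin()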
Please help me. Thanks in advance.
Asked by gowtham128 on 2023-05-14 14:25:08 UTC
Answers
Your sample odometry message has no covariance on it, so the EKF will assume a very low covariance value for it, which is probably not accurate. You will have to go into the source code of your odom and IMU drivers and manually change the covariance values so that they are realistic; if they aren't, it can cause the filter to oscillate. Also, since you are fusing the yaw value from both the odom and the IMU, you will want to set one of them to relative or differential. Maybe try the latter first and see if it helps.
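For illustration, a sketch of how nonzero covariances might be filled in when publishing the wheel odometry (the numbers below are placeholders, not tuned values):

from nav_msgs.msg import Odometry

def diag36(variances):
    # 6x6 row-major covariance with the given diagonal
    # order: x, y, z, rotation about X, rotation about Y, rotation about Z
    cov = [0.0] * 36
    for i, v in enumerate(variances):
        cov[6 * i + i] = v
    return cov

odom = Odometry()
odom.header.frame_id = "odom"
odom.child_frame_id = "base_link"
# placeholder variances: small for the states the encoders measure reasonably
# well (x, y, yaw), very large for the states they do not measure at all
odom.pose.covariance = diag36([0.01, 0.01, 1e6, 1e6, 1e6, 0.05])
odom.twist.covariance = diag36([0.01, 0.01, 1e6, 1e6, 1e6, 0.05])

The same idea applies to the IMU driver: the orientation_covariance, angular_velocity_covariance, and linear_acceleration_covariance arrays on sensor_msgs/Imu are 3x3 and should reflect the sensor's actual noise rather than default values.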
Answered by chased11 on 2023-05-26 14:05:27 UTC
Comments