Robot localization: stable with the IMU alone; adding other pose sources (e.g. AMCL) causes seizures
On the quest to stop the robot from drifting, we implemented an IMU that feeds a Madgwick filter. After painstaking troubleshooting with both the Arduino and the Raspberry Pi used to read the data and convert it for the ROS core, we got it to display reasonably stably within rviz, until we started driving the robot. The main issue, which has since been resolved, was that on rotation the robot within rviz would start shaking, and the shaking grew heavier the further we rotated it; when we returned it to its original position the shaking stopped, but the pose remained rotated away from its original setting.
The remaining issue is that adding the laser_scan_matcher's position to the robot_localization mix causes the robot within rviz to warp to and fro at a high rate, accumulating wrong/conflicting position data until the robot is off the map completely. Adding amcl's position data causes an epileptic seizure in both rviz and the robot's positioning data, luckily not in ourselves.
The IMU has been properly reconfigured as per the last edit(s); however, I can't seem to find the cause of the conflict between the different posing algorithms, nor can I work out the right way of implementing one or two robot_localization nodes as suggested in some other answers on the community.
In short, the question is: how would I implement both laser_scan_matcher for positioning via LIDAR (the original method) and/or amcl without them and the IMU fighting each other?
The necessary configuration files are as follows, along with the frames.pdf that tf provided:
navigation.launch:
<launch>
<node if="$(eval sim == 'true')" pkg="rosbag" type="play" name="play" args="$(arg path)/bags/t5_lidar.bag --delay=5 --clock"/>
<node pkg="tf" type="static_transform_publisher" name="base_link_to_laser" args="0.15 0.0 0.0 0.0 0.0 0.0 /base_link /laser 40" />
<node pkg="tf" type="static_transform_publisher" name="base_link_to_imu" args="-0.12 0.23 0.0 0.0 0.0 0.0 /base_link /imu_base 40" />
<node pkg="tf" type="static_transform_publisher" name="world_to_map" args="0.0 0 0 0.0 0.0 0.0 /world /map 40"/>
<node pkg="tf" type="static_transform_publisher" name="world_to_map_nav" args="0.0 0 0 0.0 0.0 0.0 /world /map_nav 40"/>
<!--
<node pkg="laser_filters" type="scan_to_scan_filter_chain" name="laser_filter">
<rosparam command="load" file="$(arg path)/config/laser_filter_config.yaml" />
</node>
-->
<node pkg="rviz" type="rviz" name="rviz" args="-d $(arg path)/navigation.rviz"/>
<node pkg="laser_scan_matcher" type="laser_scan_matcher_node" name="laser_scan_matcher_node" output="screen">
<rosparam file="$(arg path)/config/laser_scan_matcher_params.yaml" command="load" />
<remap from="scan" to="scan_filtered" />
</node>
<node pkg="map_server" type="map_server" name="map_server" args="$(arg path)/maps/T5_full_small.yaml" output="screen"/>
<node pkg="map_server" type="map_server" name="map_server_nav" args="$(arg path)/maps/T5_full_small_move_base.yaml" output="screen">
<remap from="map" to="map_nav"/>
</node>
<node pkg="robot_localization" type="ekf_localization_node" name="ekf_odom" clear_params="true">
<rosparam command="load" file="$(arg path)/config/robot_localization_params.yaml" />
</node>
<node pkg="amcl" type="amcl" name="amcl" output="screen">
<rosparam file="$(arg path)/config/amcl_params.yaml" command="load" />
<remap from="scan" to="scan_filtered" />
</node>
<node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen">
<rosparam file="$(arg path)/config/costmap_common_params.yaml" command="load" ns="global_costmap" />
<rosparam file="$(arg path)/config/costmap_common_params.yaml" command="load" ns="local_costmap" />
<rosparam file="$(arg path)/config/local_costmap_params.yaml" command="load" />
<rosparam file="$(arg path)/config/global_costmap_params.yaml" command="load" />
<rosparam file="$(arg path)/config/base_local_planner_params.yaml" command="load" />
<param name="base_local_planner" value="teb_local_planner/TebLocalPlannerROS" />
<param name="controller_frequency" value="5.0" />
</node>
</launch>
robot_localization_params.yaml:
frequency: 20
sensor_timeout: 0.1
two_d_mode: true
publish_tf: true
publish_acceleration: true
map_frame: map # Defaults to "map" if unspecified
odom_frame: odom # Defaults to "odom" if unspecified
base_link_frame: base_link # Defaults to "base_link" if unspecified
world_frame: odom # Defaults to the value of odom_frame if unspecified
# Odometry input settings
odom0: odom
odom0_config: [true, true, false,
false, false, false,
false, false, false,
false, false, true,
false, false, false]
odom0_queue_size: 2
odom0_nodelay: true
odom0_differential: true
odom0_relative: false
#odom0_pose_rejection_threshold: 5
#odom0_twist_rejection_threshold: 1
# IMU sensor input
imu0: /imu/data
imu0_config: [false, false, false,
true, true, true,
false, false, false,
true, true, true,
true, true, true]
imu0_nodelay: true
imu0_differential: true
imu0_relative: false
imu0_queue_size: 5
imu0_pose_rejection_threshold: 0.01 # Note the difference in parameter names
imu0_twist_rejection_threshold: 0.01
imu0_linear_acceleration_rejection_threshold: 0.01
imu0_remove_gravitational_acceleration: true
amcl_params.yaml:
gui_publish_rate: 5.0
laser_max_beams: 30
min_particles: 500
max_particles: 15000
kld_err: 0.05
kld_z: 0.99
odom_model_type: diff
odom_alpha1: 0.0
odom_alpha2: 0.0
odom_alpha3: 0.0
odom_alpha4: 0.0
odom_alpha5: 0.0
laser_z_hit: 0.5
laser_z_short: 0.05
laser_z_max: 0.05
laser_z_rand: 0.5
laser_sigma_hit: 0.2
laser_lambda_short: 0.1
laser_model_type: likelihood_field
laser_likelihood_max_dist: 2.0
update_min_d: 0.2
update_min_a: 0.5
odom_frame_id: odom
tf_broadcast: true
resample_interval: 1
transform_tolerance: 0.1
recovery_alpha_slow: 0.0
recovery_alpha_fast: 0.0
laser_scan_matcher_params.yaml:
base_frame: base_link
fixed_frame: odom
vel: /cmd_vel
use_cloud_input: false
publish_tf: true
publish_odom: true
publish_pose: false
publish_pose_stamped: true
use_odom: true
use_imu: true
use_vel: true
use_alpha_beta: true
max_iterations: 10
last but not least, frames.pdf:
-- Edit 21/05/19 -- Sensor message outputs:
laser_scan_matcher (geometry_msgs/PoseStamped):
pose:
position:
x: 0.143637594785
y: -0.464695389029
z: 0.0
orientation:
x: 0.0
y: 0.0
z: 0.705418616116
w: 0.708790925476
---
header:
seq: 16011
stamp:
secs: 1558433349
nsecs: 990096818
frame_id: "odom"
pose:
position:
x: 0.142629867206
y: -0.464176164145
z: 0.0
orientation:
x: 0.0
y: 0.0
z: 0.705386865759
w: 0.708822523355
IMU Sensor (MPU9250) (sensor_msgs/Imu):
header:
seq: 75349
stamp:
secs: 1558433544
nsecs: 761164170
frame_id: "imu_link"
orientation:
x: -0.0121845967804
y: 0.029446108886
z: 0.0100522923438
w: 0.999441550913
orientation_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity:
x: 0.0
y: 0.0
z: 0.0
angular_velocity_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
linear_acceleration:
x: 0.0
y: 0.0
z: 0.0
linear_acceleration_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
---
header:
seq: 75350
stamp:
secs: 1558433544
nsecs: 776157932
frame_id: "imu_link"
orientation:
x: -0.0121845967804
y: 0.029446108886
z: 0.0100522923438
w: 0.999441550913
orientation_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity:
x: 0.0
y: 0.0
z: 0.0
angular_velocity_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
linear_acceleration:
x: 0.0
y: 0.0
z: 0.0
linear_acceleration_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
---
Asked by halberd on 2019-04-23 03:18:46 UTC
Answers
The primary cause of this issue was a misinterpretation of how robot_localization should be implemented. Though the package suggests it would "merge" the output of sensors, the output of amcl isn't directly usable here, as it reports coordinates on the map versus the raw coordinates returned by the sensors: amcl reports a position of X: 20, Y: -60 on the map I've used, for example, after giving it an estimated pose within rviz. The sensors would only keep track of the difference from their starting coordinates of X: 0, Y: 0, thus causing the epileptic effect of the robot jumping between those two estimates as both packages held firm to their estimations.
As such, the original two localization packages, laser_scan_matcher and amcl, were temporarily removed and replaced with two robot_localization nodes, fed pose and orientation packets from the LIDAR (via laser_scan_matcher, configured not to publish to tf) and from the IMU (an MPU9250).
Although this still hasn't proven to be the right solution, as the map moves with the robot, it did resolve the "epilepsy" caused by the wildly different estimates of amcl versus the other packages (e.g. massive jumping between 0,0 and 20,60) by keeping only one source of estimation.
The big picture hasn't been fully resolved yet, but the lesson learned from this issue is to review the pose messages returned by the packages and to avoid having two parties report wildly different estimates; that can be done by reviewing the tf tree for conflicts and by using rostopic echo to check that the estimated positions from each package match.
The setup now replaces laser_scan_matcher in the tf tree with robot_localization, which fuses the IMU's orientation data with the LIDAR's position estimate.
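For reference, a minimal sketch of that final arrangement as an EKF config; the pose0 topic name is an assumption, and note that robot_localization's poseN inputs expect geometry_msgs/PoseWithCovarianceStamped, so laser_scan_matcher would need a covariance-stamped pose output enabled:

```yaml
# robot_localization_params.yaml (sketch): one EKF owns odom->base_link
frequency: 20
two_d_mode: true
publish_tf: true       # the EKF is the ONLY odom->base_link broadcaster
world_frame: odom

# Pose estimate from laser_scan_matcher (publish_tf: false on its side).
pose0: pose_with_covariance_stamped   # assumed topic name
pose0_config: [true,  true,  false,   # fuse x, y
               false, false, true,    # fuse yaw
               false, false, false,
               false, false, false,
               false, false, false]

# IMU contributes orientation and yaw rate; the LIDAR owns x/y.
imu0: /imu/data
imu0_config: [false, false, false,
              false, false, true,     # yaw
              false, false, false,
              false, false, true,     # yaw velocity
              false, false, false]
imu0_differential: true
```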
Asked by halberd on 2019-06-13 02:56:35 UTC
Comments
Though the package suggests it would "merge" the output of sensors
So it does merge them, in that it carries out a weighted average between the current measurement and the current belief state, and generates a new state (which is how all Kalman filters work). But each measurement is (and should be) processed in isolation. The problem is that you were trying to fuse data in two different coordinate frames, but were telling the filter they were the same coordinate frame. So the filter receives measurement 1, and jumps to a pose near that measurement. Then it receives a pose in a different frame in measurement 2, and jumps to that.
If it were me, I'd fuse the pose data from the most accurate pose source, and then (a) fuse velocity data from the other source, if available, or (b) turn on the differential parameter for that source, so that its poses get converted to velocity data.
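The two options above might look like this in the EKF config (a sketch; the topic names are hypothetical placeholders):

```yaml
# Option (a): absolute pose from the most trusted source only...
pose0: /slam_pose                     # hypothetical: best absolute pose source
pose0_config: [true,  true,  false,   # x, y
               false, false, true,    # yaw
               false, false, false,
               false, false, false,
               false, false, false]

# ...and velocity (not pose) from the second source.
odom0: /wheel_odom                    # hypothetical second source
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,   # vx, vy
               false, false, true,    # yaw rate
               false, false, false]

# Option (b): if the second source only provides pose, fuse its pose
# fields but set the differential parameter, so consecutive poses are
# converted to velocities instead of competing absolute positions.
# odom0_differential: true
```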
Asked by Tom Moore on 2019-07-03 04:09:25 UTC
Comments
so you still need to do this?
Many packages have an (implicit) assumption (based on REP-103) about chirality and axis alignment. If your data doesn't adhere to that, things will go wrong.
Asked by gvdhoorn on 2019-04-23 03:39:09 UTC
We're doing this right now, considering the rotation and linear acceleration are reversed; our foolish mistake. Edit: we've resolved the IMU's goofed axes, thereby eliminating the epileptic rotations. However, the map now skews to the right/down a little bit. A problem from the past come to haunt us again.
Asked by halberd on 2019-04-23 03:42:48 UTC
In a round of experimentation, I've found that adding a secondary ekf node in place of amcl (ergo map->odom) does not cause any epileptic effects, but when moving the robot, it moves accurately while the map stays frozen in place. I can't discover why, and at worst it makes it impossible to use the "2D Pose Estimate" button within rviz. 'Tis a monster we're working with, sadly.
Asked by halberd on 2019-04-23 07:07:15 UTC
Please include sample messages from each sensor input. Your EKF config is also very confusing, as you have a lot of things commented out. Off the top of my head, one thing I see straight away is that you are (or were?) fusing absolute pose data from a bunch of different sources.
Anyway, clear out anything in the EKF config that doesn't need to be there, and let's start with a simple setup, and work our way up. You clearly have some issue with conflicting frames, and/or multiple nodes trying to publish the same transform. But I'd need to see all the sensor inputs to comment further.
Asked by Tom Moore on 2019-05-15 04:34:52 UTC
I was indeed fusing the estimated pose from amcl (really bad idea), laser_scan_matcher and the IMU all at once. Since then I've reduced the set-up to what it was beforehand, hooking the IMU sensor to laser_scan_matcher instead. The sensor messages will be included ASAP; I've noticed that the orientations of the laser and IMU don't quite match either, so we're working on reducing that difference as best we can.
Asked by halberd on 2019-05-21 05:07:46 UTC
I also note that you have both the laser scan matcher and the EKF publishing to tf. I'm assuming that means you have two odom->base_link publishers. That's going to make lots of things unhappy.
Asked by Tom Moore on 2019-06-06 02:07:09 UTC
Learned that the hard way, so as a result I've set publish_tf to false in the laser scan matcher so the EKF is the only publisher. One gripe is that I was also feeding the position estimate from the laser scan matcher into the EKF at one point. In later experiments I found that the laser was often publishing position estimates that would be accurate (around 0.02) but would often jump to values higher than 5. What causes this is unknown.
Asked by halberd on 2019-06-06 02:47:20 UTC
So is this all working for you now?
Asked by Tom Moore on 2019-06-13 02:17:35 UTC
From what I can gather, the current implementation seems to work properly for a while, then becomes slightly inaccurate (which is to be expected of an IMU), but steady progress is being made. Despite a few tiny jumps (perhaps due to the LIDAR being jumbled by light/glass walls), it's safe to say the main issue can be closed, since there are no (large) seizures anymore.
Asked by halberd on 2019-06-13 02:38:07 UTC
It would be nice if 'someone' (either @halberd or @Tom Moore) could post an answer summarising the steps that resolved the problems reported by @halberd.
Asked by gvdhoorn on 2019-06-13 02:39:33 UTC
How would you "smooth" the IMU data? I'm trying to filter out the abrupt variations present in the IMU data, as I'm using it to transform subsequently received point clouds, but the results are a bit messy.
Asked by dogh on 2019-06-13 13:53:18 UTC
We thought of using a filter function, in our case at the Arduino level (our IMU is wired to that), before the quaternion is calculated, to skip calculating/including readings that jump past a certain threshold, much like how the LIDAR's data would suddenly jump from the 0 - 1 range to values like 5 - 7. Alas, we haven't had much time to implement either one.
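The thresholding idea above could be sketched as follows (shown in Python for clarity; on our setup it would live in the Arduino sketch before the quaternion is computed, and the threshold value is an assumption to be tuned per sensor):

```python
class JumpRejectFilter:
    """Reject samples that jump more than `threshold` away from the last
    accepted value, holding the previous value instead of the outlier."""

    def __init__(self, threshold=2.0):
        self.threshold = threshold
        self.last = None  # last accepted sample

    def update(self, sample):
        # Accept the first sample, and any sample within the threshold.
        if self.last is None or abs(sample - self.last) <= self.threshold:
            self.last = sample
        # Otherwise keep the previous value, discarding the spike.
        return self.last

# Example: a reading stream with one spurious spike (like the LIDAR
# jumping from the 0 - 1 range to 5 - 7).
f = JumpRejectFilter(threshold=2.0)
readings = [0.1, 0.3, 6.8, 0.4]
filtered = [f.update(r) for r in readings]  # -> [0.1, 0.3, 0.3, 0.4]
```

This holds the last good value rather than interpolating; for smoothing the remaining jitter, a complementary or moving-average filter could follow it.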
Asked by halberd on 2019-06-13 15:03:10 UTC