
Robot localization - stable with the IMU alone; adding other pose sources (e.g. AMCL) causes seizures.

asked 2019-04-23 03:18:46 -0500 by halberd, updated 2019-05-21 05:13:41 -0500

On the quest to stop the robot from drifting, we have added an IMU whose data runs through a Madgwick filter. After painstaking troubleshooting of both the Arduino and the Raspberry Pi used to read and convert the data for the ROS core, we got it working and displaying reasonably stably in rviz, until we started driving the robot. The main issue, which has since been resolved, was that on rotation the robot model in rviz would start shaking, and the shaking grew worse the more we rotated; when we returned the robot to its original position the shaking stopped, but the pose remained rotated away from its original setting.

The remaining issue is that adding the laser_scan_matcher's pose to the robot_localization mix causes the robot in rviz to warp back and forth at a high rate, accumulating conflicting position estimates until the robot is off the map completely. Adding amcl's pose data causes an epileptic seizure in both rviz and the robot's position data, luckily not in ourselves.

The IMU itself has been properly reconfigured as per the last edit(s), but I can't find the cause of the conflict between the different pose sources, nor have I worked out the right way to set up one or two robot_localization nodes as suggested in some other answers on this site.

In short: how do I use laser_scan_matcher for positioning via LIDAR (our original method) and/or amcl without them and the IMU fighting each other?

The necessary configuration files are as follows, along with the frames.pdf that tf provided:

navigation.launch:

<launch>
    <arg name="sim" default="false"/>
    <arg name="path"/>

<node if="$(eval sim == 'true')" pkg="rosbag" type="play" name="play" args="$(arg path)/bags/t5_lidar.bag --delay=5 --clock"/>

<node pkg="tf" type="static_transform_publisher" name="base_link_to_laser" args="0.15 0.0 0.0 0.0 0.0 0.0 /base_link /laser 40" />
<node pkg="tf" type="static_transform_publisher" name="base_link_to_imu" args="-0.12 0.23 0.0 0.0 0.0 0.0 /base_link /imu_base 40" />


<node pkg="tf" type="static_transform_publisher" name="world_to_map" args="0.0 0 0 0.0 0.0 0.0 /world /map 40"/>
<node pkg="tf" type="static_transform_publisher" name="world_to_map_nav" args="0.0 0 0 0.0 0.0 0.0 /world /map_nav 40"/>

<!-- 
    <node pkg="laser_filters" type="scan_to_scan_filter_chain" name="laser_filter">
        <rosparam command="load" file="$(arg path)/config/laser_filter_config.yaml" />
    </node> 
-->

<node pkg="rviz" type="rviz" name="rviz" args="-d $(arg path)/navigation.rviz"/>

<node pkg="laser_scan_matcher" type="laser_scan_matcher_node" name="laser_scan_matcher_node" output="screen">
    <rosparam file="$(arg path)/config/laser_scan_matcher_params.yaml" command="load" />
    <remap from="scan" to="scan_filtered" />
</node>

<node pkg="map_server" type="map_server" name="map_server" args="$(arg path)/maps/T5_full_small.yaml" output="screen"/>
<node pkg="map_server" type="map_server" name="map_server_nav" args="$(arg path)/maps/T5_full_small_move_base.yaml" output="screen">
    <remap from="map" to="map_nav"/>
</node>

<node pkg="robot_localization" type="ekf_localization_node" name="ekf_odom" clear_params="true">
    <rosparam command="load" file="$(arg path)/config/robot_localization_params.yaml" />
</node ...

Comments

I'm already aware the IMU still needs some tuning (e.g. swapping positive/negative axes, rotation, etc.)

so you still need to do this?

Many packages have an (implicit) assumption (based on REP-103) about chirality and axis alignment. If your data doesn't adhere to that, things will go wrong.
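If it's just the physical mounting orientation: robot_localization should transform incoming IMU data into base_link via tf, so encoding the mounting rotation in your static transform may be all that's needed. A rough sketch; the 180° yaw here is only a hypothetical example, the actual values depend on your hardware:

<!-- static_transform_publisher args: x y z yaw pitch roll parent child period_ms -->
<!-- hypothetical 180-degree yaw offset; replace with your real mounting rotation -->
<node pkg="tf" type="static_transform_publisher" name="base_link_to_imu"
      args="-0.12 0.23 0.0 3.14159 0.0 0.0 /base_link /imu_base 40" />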

gvdhoorn ( 2019-04-23 03:39:09 -0500 )

We're doing that right now; the rotation and linear acceleration axes were reversed, our foolish mistake. Edit: we've resolved the IMU's goofed axes, thereby eliminating the epileptic rotations. However, the map now skews a little to the right/down: a problem from the past come back to haunt us.

halberd ( 2019-04-23 03:42:48 -0500 )

In a round of experimentation, I found that adding a second ekf node in place of amcl (i.e. publishing map->odom) doesn't cause any epileptic effects, and when driving the robot it does move accurately, but the map stays frozen in place. I can't work out why, and at worst it makes the "2D Pose Estimate" button in rviz unusable.

...Tis a monster we're working with, sadly.

halberd ( 2019-04-23 07:07:15 -0500 )

Please include sample messages from each sensor input. Your EKF config is also very confusing, as you have a lot of things commented out. Off the top of my head, one thing I see straight away is that you are (or were?) fusing absolute pose data from a bunch of different sources.

Anyway, clear out anything in the EKF config that doesn't need to be there, and let's start with a simple setup, and work our way up. You clearly have some issue with conflicting frames, and/or multiple nodes trying to publish the same transform. But I'd need to see all the sensor inputs to comment further.
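Something like this stripped-down sketch would be the starting point (the topic name and the imu0_config mask are placeholders to adapt to your setup):

<node pkg="robot_localization" type="ekf_localization_node" name="ekf_odom" clear_params="true">
    <rosparam>
        frequency: 30
        two_d_mode: true
        map_frame: map
        odom_frame: odom
        base_link_frame: base_link
        world_frame: odom            # this instance owns odom -> base_link

        imu0: /imu/data              # hypothetical topic name
        # 15-element mask: x, y, z, roll, pitch, yaw,
        #                  vx, vy, vz, vroll, vpitch, vyaw,
        #                  ax, ay, az
        imu0_config: [false, false, false,
                      false, false, true,
                      false, false, false,
                      false, false, true,
                      true,  false, false]
        imu0_remove_gravitational_acceleration: true
    </rosparam>
</node>

Once that behaves, add one source at a time and check the output after each addition.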

Tom Moore ( 2019-05-15 04:34:52 -0500 )

I was indeed fusing the estimated pose from amcl (a really bad idea), laser_scan_matcher, and the IMU all at once. Since then I've reduced the setup to what it was beforehand, feeding the IMU into laser_scan_matcher instead.

The sensor messages will be included ASAP. I've also noticed that the orientations of the laser and the IMU don't quite match, so we're working on reducing that difference as best we can.

halberd ( 2019-05-21 05:07:46 -0500 )

I also note that you have both the laser scan matcher and the EKF publishing to tf. I'm assuming that means you have two odom->base_link publishers. That's going to make lots of things unhappy.

Tom Moore ( 2019-06-06 02:07:09 -0500 )

Learned that the hard way, so I've since set publish_tf to false in the laser scan matcher so that the EKF is the only odom->base_link publisher. One gripe is that at one point I was also feeding the laser scan matcher's position estimate into the EKF, and in later experiments I found the laser was often publishing position estimates that were accurate (to around 0.02 m) but would occasionally jump to values larger than 5 m. What causes this is unknown.
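I'm now looking at robot_localization's per-sensor Mahalanobis-distance rejection thresholds to drop exactly those outlier jumps. A sketch, assuming a laser_scan_matcher version recent enough to publish PoseWithCovarianceStamped; the topic name and the threshold value are placeholders still to be tuned:

<node pkg="laser_scan_matcher" type="laser_scan_matcher_node" name="laser_scan_matcher_node">
    <param name="publish_tf" value="false"/>  <!-- the EKF owns odom -> base_link -->
    <param name="publish_pose_with_covariance_stamped" value="true"/>
</node>

<node pkg="robot_localization" type="ekf_localization_node" name="ekf_odom" clear_params="true">
    <rosparam>
        two_d_mode: true
        world_frame: odom
        pose0: /pose_with_covariance_stamped   # laser_scan_matcher output
        pose0_config: [true,  true,  false,
                       false, false, true,
                       false, false, false,
                       false, false, false,
                       false, false, false]
        # drop measurements whose Mahalanobis distance exceeds this (tune for your data)
        pose0_rejection_threshold: 2.0
    </rosparam>
</node>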

halberd ( 2019-06-06 02:47:20 -0500 )

So is this all working for you now?

Tom Moore ( 2019-06-13 02:17:35 -0500 )

1 Answer


answered 2019-06-13 02:56:35 -0500 by halberd

The primary cause of this issue was a misinterpretation of how robot_localization should be used. Although the package does "merge" the output of sensors, the output of amcl can't simply be mixed in, because amcl reports coordinates in the map frame while the other sources report poses relative to where the robot started. For example, after being given an estimated pose in rviz, amcl reported a position of roughly X: 20, Y: -60 on our map, while the sensors kept tracking from X: 0, Y: 0. The filter then jumped between those two estimates, since both packages held firm to their own, producing the epileptic effect.

As such, the original two localization packages, laser_scan_matcher and amcl, were temporarily removed and replaced with two robot_localization nodes, both fed pose and orientation messages from the LIDAR (via laser_scan_matcher, configured not to publish to tf) and the IMU (an MPU9250). Although this still isn't the final solution, since the map now moves with the robot, it did resolve the "epilepsy" caused by two wildly different estimates (the massive jumping between 0,0 and 20,-60), because there is now only one source of absolute position.

The big picture isn't fully resolved yet, but the lesson learned is to review the pose messages each package publishes and to avoid having two parties report wildly different estimates. That means checking the tf tree for conflicts and using rostopic echo to verify that the estimated positions from each package agree. The current setup replaces laser_scan_matcher in the tf tree with robot_localization, which fuses the IMU's orientation data with the LIDAR's position estimate.
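For anyone who finds this later, the dual-instance layout from the robot_localization docs looks roughly like this (a sketch; topic names and config masks are placeholders, and we haven't fully verified it on our robot). The second instance is where a map-frame source such as amcl's pose would eventually be fused:

<!-- instance 1: fuses continuous sources, publishes odom -> base_link -->
<node pkg="robot_localization" type="ekf_localization_node" name="ekf_odom" clear_params="true">
    <rosparam>
        frequency: 30
        two_d_mode: true
        map_frame: map
        odom_frame: odom
        base_link_frame: base_link
        world_frame: odom
        imu0: /imu/data
        imu0_config: [false, false, false,
                      false, false, true,
                      false, false, false,
                      false, false, true,
                      true,  false, false]
    </rosparam>
</node>

<!-- instance 2: same continuous sources plus a global (map-frame) pose,
     publishes map -> odom -->
<node pkg="robot_localization" type="ekf_localization_node" name="ekf_map" clear_params="true">
    <rosparam>
        frequency: 30
        two_d_mode: true
        map_frame: map
        odom_frame: odom
        base_link_frame: base_link
        world_frame: map
        imu0: /imu/data
        imu0_config: [false, false, false,
                      false, false, true,
                      false, false, false,
                      false, false, true,
                      true,  false, false]
        pose0: /amcl_pose            # global pose, already in the map frame
        pose0_config: [true,  true,  false,
                       false, false, true,
                       false, false, false,
                       false, false, false,
                       false, false, false]
    </rosparam>
</node>

The key point is that each instance gets exactly one world_frame, and only the map-frame instance consumes map-referenced data.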


Comments

Though the package suggests it would "merge" the output of sensors

So it _does_ merge them, in that it carries out a weighted average between the current measurement and the current belief state, and generates a new state (which is how all Kalman filters work). But each measurement is (and should be) processed in isolation. The problem is that you were trying to fuse data in two different coordinate frames, but were telling the filter they were the same coordinate frame. So the filter receives measurement 1, and jumps to a pose near that measurement. Then it receives a pose in a different frame in measurement 2, and jumps to that.

If it were me, I'd fuse the pose data from the most accurate pose source, and then (a) fuse velocity data from the other source, if available, or (b) turn on the differential parameter for that source, so ...
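In config terms, option (b) would look roughly like this sketch (pose0 stands in for the hypothetical secondary source; frame and frequency settings as in the existing config):

<node pkg="robot_localization" type="ekf_localization_node" name="ekf_odom" clear_params="true">
    <rosparam>
        pose0: /secondary_pose        # hypothetical secondary pose topic
        pose0_config: [true,  true,  false,
                       false, false, true,
                       false, false, false,
                       false, false, false,
                       false, false, false]
        # fuse pose-to-pose deltas rather than absolute values,
        # so the two sources' origins can't fight each other
        pose0_differential: true
    </rosparam>
</node>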

Tom Moore ( 2019-07-03 04:09:25 -0500 )
