Problem about fusing uwb and odom data with robot_localization

asked 2022-01-07 21:26:30 -0500

HIT_LR

updated 2022-01-17 02:38:11 -0500

Tom Moore

I would like to fuse data from several sensors to localize the robot. When I use the robot_localization package to fuse UWB and odometry data, a problem occurs. I feed in two topics: /loc, generated from the UWB sensor, is a geometry_msgs/PoseWithCovarianceStamped, while /odom is a nav_msgs/Odometry. The problem is that the odometry/filtered output is always very similar (almost equal) to either the odom data or the UWB data. My ekf_params.yaml settings are below:

frequency: 10
sensor_timeout: 0.1
two_d_mode: true
transform_time_offset: 0.0
transform_timeout: 0.0
print_diagnostics: true
debug: false
debug_out_file: /home/robot/file1.txt
publish_tf: true
publish_acceleration: false      
odom_frame: odom          
base_link_frame: base_footprint 
world_frame: odom           
odom0: /odom
odom0_config: [true,  true,  false,
               false, false, false,
               false, false, false,
               false, false, false,
               false, false, false]
odom0_queue_size: 2
odom0_nodelay: false
odom0_differential: false
odom0_relative: false
odom0_pose_rejection_threshold: 5
odom0_twist_rejection_threshold: 1

pose0: /loc
pose0_config: [true,  true,  false,
               false, false, false,
               false, false, false,
               false, false, false,
               false, false, false]
pose0_differential: false
pose0_relative: false
pose0_queue_size: 5
pose0_rejection_threshold: 2 
pose0_nodelay: false

use_control: true
stamped_control: false
control_timeout: 0.2
control_config: [true, false, false, false, false, true]
acceleration_limits: [1.3, 0.0, 0.0, 0.0, 0.0, 3.4]
deceleration_limits: [1.3, 0.0, 0.0, 0.0, 0.0, 4.5]
acceleration_gains: [0.8, 0.0, 0.0, 0.0, 0.0, 0.9]
deceleration_gains: [1.0, 0.0, 0.0, 0.0, 0.0, 1.0]

process_noise_covariance: [0.05, 0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0.05, 0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0.06, 0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0.03, 0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0.03, 0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0.06, 0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0.025, 0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0.025, 0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0.04, 0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0.01, 0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0.01, 0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0.02, 0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0.01, 0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0.01, 0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0.015]


initial_estimate_covariance: [1e-1, 0,    0,    0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,
                              0,    1e-1, 0,    0,    0,    0,    0,    0,    0,    0 ...

Comments

Please take a look at this previous answer: https://answers.ros.org/question/2917...

There are many great suggestions there, and it would be helpful to know which ones you have already tried.

osilva (2022-01-08 02:54:07 -0500)

What sensor is providing the information that ends up on /odom? And on /loc?

Also, it would help if you explained what UWB is.

Please edit your description to add this, using the "edit" button at the end of the description.

Mike Scheutzow (2022-01-08 11:41:34 -0500)

Please include a sample sensor message for each sensor input.

Tom Moore (2022-01-14 03:37:24 -0500)

Sorry for my late reply; it seems I had not edited my question correctly, since I'm new here.

HIT_LR (2022-01-15 07:52:19 -0500)

If uwb is a sensor, please provide a link to technical information about it. How often does it provide location data?

Mike Scheutzow (2022-01-15 08:20:18 -0500)

The UWB data frequency is 1 Hz, which I set arbitrarily; it can be set to 5 Hz, 10 Hz, 20 Hz, and so on. Its error relative to ground truth is about 0.05 m. Is there any other technical information I should provide?

HIT_LR (2022-01-16 02:47:21 -0500)

1 Answer


answered 2022-01-17 02:58:05 -0500

Tom Moore

There are a bunch of issues here.

  • You are trying to fuse two different sources of absolute pose data, but they don't agree (as in, they aren't in the same frame).

Here's your /odom config:

odom0_config: [true,  true,  false,
               false, false, false,
               false, false, false,
               false, false, false,
               false, false, false]

And here's your /loc config:

pose0_config: [true,  true,  false,
               false, false, false,
               false, false, false,
               false, false, false,
               false, false, false]

Now let's imagine that you receive a measurement on the /odom topic. The robot's (X, Y) position will jump some distance between the previous state and that measurement (how far it jumps depends on their relative covariances). Then you receive a measurement on /loc. That measurement says your (X, Y) position is somewhere totally different, so the filter will jump towards that. You are giving the filter two measurement sources and telling it that they are both in the same frame_id (odom), but they aren't.
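To see the effect, here's a rough one-dimensional sketch (not robot_localization itself, just a scalar Kalman filter with made-up variances) of what happens when two absolute position sources disagree:

```python
# Hypothetical 1-D sketch: a scalar Kalman filter fed two absolute
# position sources that disagree by 1.0 m.
def kf_update(x, p, z, r):
    """One scalar Kalman measurement update: state x, variance p,
    measurement z with variance r."""
    k = p / (p + r)                       # Kalman gain
    return x + k * (z - x), (1.0 - k) * p

x, p = 0.0, 1.0                           # initial estimate and variance
q = 0.05                                  # process noise added each step
for _ in range(10):
    p += q
    x, p = kf_update(x, p, 0.0, 0.1)      # "odom" says x = 0.0
    p += q
    x, p = kf_update(x, p, 1.0, 0.1)      # "UWB" says x = 1.0

# The estimate never converges; it settles into a cycle, jumping
# toward whichever source reported last (roughly 1/3 after the odom
# update, roughly 2/3 after the UWB update here).
```

Instead of converging, the estimate keeps jumping toward whichever source reported last, which matches the behaviour you're describing.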

If /odom is just wheel encoder odometry, then you should fuse the (X, Y) _velocity_ from it, and not the absolute position:

odom0_config: [false,  false,  false,
               false, false, false,
               true, true, false,
               false, false, false,
               false, false, false]
  • In addition to fusing absolute wheel encoder positions, you are also telling the filter that each measurement has an X and Y variance of 0.001, which is far smaller than the UWB error of 0.01. Wheel encoder odometry error should grow over time (for pose data, not twist/velocity data). But again, you should just fuse the wheel encoder velocities. Even then, though, you should make sure your wheel encoder covariance is accurate.
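If you do fuse velocities, the variances go on the diagonal of the row-major 6x6 twist covariance of your odometry message. A minimal sketch (the standard deviations here are made-up placeholders; measure your own — in a real node you would write into odom_msg.twist.covariance):

```python
# Hypothetical example: placing wheel-encoder velocity variances into
# the row-major 6x6 twist covariance [vx, vy, vz, vroll, vpitch, vyaw].
vx_stddev = 0.02    # m/s, assumed encoder noise
vyaw_stddev = 0.05  # rad/s, assumed

twist_covariance = [0.0] * 36
twist_covariance[0] = vx_stddev ** 2      # element (0,0): var(vx)
twist_covariance[7] = 1e-3                # element (1,1): var(vy), near zero for a diff-drive base
twist_covariance[35] = vyaw_stddev ** 2   # element (5,5): var(vyaw)
```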

  • You have two_d_mode on, which is fine, but you aren't measuring orientation (yaw) at all. The filter expects to have a velocity or position reference for X, Y, and Yaw when two_d_mode is enabled. Without it, your yaw variance will explode, and the correlation to the other variables will cause your state to behave erratically. So you need to measure the robot's orientation. Ordinarily, I'd just say to fuse the yaw velocity from the wheel encoders:

    odom0_config: [false,  false,  false,
                   false, false, false,
                   true, true, false,
                   false, false, true,
                   false, false, false]
    

However, in this case, it will create a new problem. Your integrated wheel encoder odometry yaw velocity (and absolute yaw) won't agree with your UWB sensor. In other words, if your UWB sensor says you start at (1, 1), and then the next measurement is (2, 2), then unless your robot is holonomic, the robot's heading should be pi/4 radians. But your wheel encoder odometry might think the robot is facing a totally different direction. So if I were you, I might modify the UWB sensor to estimate heading based on relative UWB measurements, and then fuse X, Y, and Yaw from that sensor.
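A minimal sketch of that last idea (a hypothetical helper, assuming a non-holonomic robot driving forward; you would still need to publish the result as a PoseWithCovarianceStamped with a sensible yaw covariance):

```python
import math

# Hypothetical helper: derive a heading estimate from two consecutive
# UWB position fixes. Only meaningful while the robot is actually
# translating, so small displacements are rejected.
def heading_from_uwb(prev_xy, curr_xy, min_dist=0.02):
    """Return yaw in radians, or None if the robot barely moved."""
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    if math.hypot(dx, dy) < min_dist:
        return None  # too little motion to trust the bearing
    return math.atan2(dy, dx)

# The example from above: a fix at (1, 1) followed by one at (2, 2)
# implies a heading of pi/4 radians.
yaw = heading_from_uwb((1.0, 1.0), (2.0, 2.0))
```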

  • You have too many of the "advanced" parameters specified. You have rejection thresholds specified for your sensors, and you also ...
