EKF with ZED
I want to fuse the ZED pose with an external GPS. As a first step, I am simply passing the ZED pose through an EKF (robot_localization) node, and I expect to see the same behavior as without the EKF. To that end, I disable publish_map_tf (map -> odom) and run an EKF node that subscribes to the pose topic. Although I expect no change in behavior, my base_link eventually drifts, as if it had a velocity even while the robot stays in the same place.
My ZED2 parameters are:
# params/common.yaml
# Common parameters to Stereolabs ZED and ZED mini cameras
---
# Dynamic parameters cannot have a namespace
brightness: 4 # Dynamic
contrast: 4 # Dynamic
hue: 0 # Dynamic
saturation: 4 # Dynamic
sharpness: 4 # Dynamic
gamma: 8 # Dynamic - Requires SDK >=v3.1
auto_exposure_gain: true # Dynamic
gain: 100 # Dynamic - works only if `auto_exposure_gain` is false
exposure: 100 # Dynamic - works only if `auto_exposure_gain` is false
auto_whitebalance: true # Dynamic
whitebalance_temperature: 42 # Dynamic - works only if `auto_whitebalance` is false
depth_confidence: 50 # Dynamic
depth_texture_conf: 100 # Dynamic
pub_frame_rate: 15.0 # Dynamic - frequency of publishing of video and depth data
point_cloud_freq: 30.0 # Dynamic - frequency of the point cloud publishing (equal to or less than the `grab_frame_rate` value)
general:
    camera_name: zed # A name for the camera (can be different from camera model and node name and can be overwritten by the launch file)
    zed_id: 0
    serial_number: 0
    resolution: 3 # '0': HD2K, '1': HD1080, '2': HD720, '3': VGA
    grab_frame_rate: 30 # Frequency of frame grabbing for internal SDK operations
    gpu_id: -1
    base_frame: 'base_link' # must be equal to the frame_id used in the URDF file
    verbose: false # Enable info messages from the ZED SDK
    svo_compression: 2 # `0`: LOSSLESS, `1`: AVCHD, `2`: HEVC
    self_calib: true # enable/disable self-calibration at start
    camera_flip: false
video:
    img_downsample_factor: 1.0 # Resample factor for images [0.01,1.0]. The SDK works with native image sizes but publishes rescaled images.
    extrinsic_in_camera_frame: true # if `false`, extrinsic parameters in `camera_info` will use the ROS native frame (X FORWARD, Z UP) instead of the camera frame (Z FORWARD, Y DOWN) [`true` uses the old behavior, as for versions < v3.1]
depth:
    quality: 1 # '0': NONE, '1': PERFORMANCE, '2': QUALITY, '3': ULTRA
    sensing_mode: 0 # '0': STANDARD, '1': FILL (do not use FILL for robotic applications)
    depth_stabilization: 0 # `0`: disabled, `1`: enabled
    openni_depth_mode: false # 'false': 32-bit float meters, 'true': 16-bit uchar millimeters
    depth_downsample_factor: 1.0 # Resample factor for depth data matrices [0.01,1.0]. The SDK works with native data sizes but publishes rescaled matrices (depth map, point cloud, ...)
pos_tracking:
    pos_tracking_enabled: true # True to enable positional tracking from start
    publish_tf: true # publish `odom -> base_link` TF
    publish_map_tf: false # publish `map -> odom` TF
    map_frame: 'map'
    odometry_frame: 'odom'
    area_memory_db_path: ''
    area_memory: true # Enable to detect loop closures
    floor_alignment: false # Enable to automatically calculate the camera/floor offset
    initial_base_pose: [0.0,0.0,0.0, 0.0,0.0,0.0] # Initial position of the `base_frame` -> [X, Y, Z, R, P, Y]
    init_odom_with_first_valid_pose: true # Enable to initialize the odometry with the first valid pose
    path_pub_rate: 2.0 # Camera trajectory publishing frequency
    path_max_count: -1 # use '-1' for unlimited path size
    two_d_mode: false # Force navigation on a plane. If true, the Z value will be fixed to `fixed_z_value`, and roll and pitch to zero
    fixed_z_value: 0.05 # Value to be used for the Z coordinate if `two_d_mode` is true
mapping:
    mapping_enabled: true # True to enable mapping and fused point cloud publication
    resolution: 0.01 # map resolution in meters [0.01f, 0.2f]
    max_mapping_range: -1 # maximum depth range while mapping, in meters (-1 for automatic calculation) [2.0, 20.0]
    fused_pointcloud_freq: 0.5 # publishing frequency of the fused colored point cloud
And my robot_localization EKF parameter file:
frequency: 30
silent_tf_failure: false
sensor_timeout: 0.1
two_d_mode: false
transform_time_offset: 0.0
transform_timeout: 0.0
print_diagnostics: true
debug: false
debug_out_file: /path/to/debug/file.txt
publish_tf: true
publish_acceleration: false
permit_corrected_publication: false
map_frame: map # Defaults to "map" if unspecified
odom_frame: odom # Defaults to "odom" if unspecified
base_link_frame: base_link # Defaults to "base_link" if unspecified
world_frame: map # Defaults to the value of odom_frame if unspecified
pose0: zed2/zed_node/pose_with_covariance
pose0_config: [true, true, true,
true, true, true,
false, false, false,
false, false, false,
false, false, false]
pose0_differential: false
pose0_relative: false
pose0_queue_size: 5
#pose0_rejection_threshold: 2 # Note the difference in parameter name
pose0_nodelay: false
use_control: false
# Whether the input (assumed to be cmd_vel) is a geometry_msgs/Twist or geometry_msgs/TwistStamped message. Defaults to
# false.
stamped_control: false
# The last issued control command will be used in prediction for this period. Defaults to 0.2.
control_timeout: 0.2
# Which velocities are being controlled. Order is vx, vy, vz, vroll, vpitch, vyaw.
#control_config: [true, false, false, false, false, true]
# Places limits on how large the acceleration term will be. Should match your robot's kinematics.
#acceleration_limits: [1.3, 0.0, 0.0, 0.0, 0.0, 3.4]
# Acceleration and deceleration limits are not always the same for robots.
#deceleration_limits: [1.3, 0.0, 0.0, 0.0, 0.0, 4.5]
# If your robot cannot instantaneously reach its acceleration limit, the permitted change can be controlled with these
# gains
#acceleration_gains: [0.8, 0.0, 0.0, 0.0, 0.0, 0.9]
# If your robot cannot instantaneously reach its deceleration limit, the permitted change can be controlled with these
# gains
#deceleration_gains: [1.0, 0.0, 0.0, 0.0, 0.0, 1.0]
process_noise_covariance: [0.05, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0.05, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0.06, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0.03, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0.03, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0.06, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0.025, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0.025, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0.04, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.02, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.015]
initial_estimate_covariance: [1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9]
Asked by amit_z on 2022-08-08 07:10:41 UTC
Answers
A few clarifications first off: the EKF estimates the motion of the robot as the robot itself perceives it, which is expressed in the odom -> base_link transform. This estimate will typically drift, even in well-balanced systems. To combat this drift, an additional transform is introduced, map -> odom, which compensates for the drift* through techniques like mapping and loop closure. The ZED is capable of calculating both the map -> odom and the odom -> base_link transforms, although I would be careful about having one sensor do both. Currently, your ZED publishes odom -> base_link (publish_tf: true). Turn that off to isolate any error to the EKF alone.
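Assuming the standard ZED wrapper parameter file from the question, the change suggested here would be a small edit to common.yaml (only the relevant keys shown; this is a sketch, not a full file):

```yaml
# params/common.yaml (excerpt) - let robot_localization own the TF tree
pos_tracking:
    publish_tf: false      # stop the ZED from broadcasting odom -> base_link
    publish_map_tf: false  # map -> odom was already disabled in the question
```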
Now, if you still have drift, turn off all inputs to the EKF and restart; make sure there is no drift in that state. Then turn on one variable at a time (for instance, listen to just the x position) and confirm that it works and is stable - at each step, the position of the robot in the odom frame should stay put while the robot is stationary. Then work up to the full suite of variables you intend to listen to. I would recommend keeping the number of inputs to the EKF to a minimum.
* The robot always starts at the odom origin, so if the robot has a map localization, the map -> odom transform can also place the robot in a map.
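For the "one variable at a time" test, the pose0_config in the question could be reduced to fuse only the x position; everything else in the EKF parameter file stays as posted (this is a debugging sketch, not a recommended final configuration):

```yaml
# Fuse only x from the ZED pose while debugging.
# Order: x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az
pose0: zed2/zed_node/pose_with_covariance
pose0_config: [true,  false, false,
               false, false, false,
               false, false, false,
               false, false, false,
               false, false, false]
```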
Answered by Per Edwardsson on 2022-08-09 02:58:21 UTC
Comments
Thank you for your response.
As explained in the robot_localization docs, it is usually necessary to run two EKF nodes, one that publishes map -> odom and one for odom -> base_link. Since the ZED already fuses IMU and visual odometry to produce odom -> base_link, and I only want the GPS fusion, I used only one EKF node (map -> odom) and did not disable publish_tf. Don't you think that is a good idea?
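The single map-frame EKF described in this comment could be sketched as below. The /odometry/gps topic name assumes a navsat_transform_node is running, the pose0_differential choice is an assumption to avoid two absolute pose sources fighting each other, and the ZED keeps publishing odom -> base_link itself:

```yaml
# Sketch of a single map-frame EKF instance (assumed topic names)
frequency: 30
world_frame: map           # this instance estimates map -> odom
publish_tf: true           # broadcasts map -> odom; the ZED still owns odom -> base_link

# GPS, converted to an odometry message by navsat_transform_node
odom0: /odometry/gps
odom0_config: [true,  true,  true,
               false, false, false,
               false, false, false,
               false, false, false,
               false, false, false]

# ZED pose, consumed differentially so only the GPS provides absolute position
pose0: zed2/zed_node/pose_with_covariance
pose0_config: [true,  true,  true,
               true,  true,  true,
               false, false, false,
               false, false, false,
               false, false, false]
pose0_differential: true
```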
Commented by amit_z on 2022-08-09 07:22:38 UTC
The main problem, I believe, is that /odometry/gps, the odometry message published by navsat_transform, is drifting in all directions (x, y, and z). This happens even though the GPS signal itself looks OK (from ublox/fix).
This is the /odometry/gps msg:
header:
  seq: 326
  stamp:
    secs: 1660049500
    nsecs: 499717350
  frame_id: "map"
child_frame_id: ''
pose:
  pose:
    position:
      x: -4.9935318673960865
      y: -8.162464517634362
      z: 9.2710430592455
    orientation:
      x: 0.0
      y: 0.0
      z: 0.0
      w: 1.0
Commented by amit_z on 2022-08-09 07:56:39 UTC