How to use the robot_localization package for sensor fusion

asked 2023-05-28 22:57:22 -0600

joewong00

Hi,

I'm new to the robot_localization package. After reading through the documentation, I'm still a little vague about how to properly fuse the sensors, so here are some questions; I hope to get as much information as possible. These are the sensors I'm fusing (a sketch of my current EKF parameters follows the list):

  1. Wheel encoder odom: frame ID odom, publishing at 100 Hz
  2. IMU in ENU convention: frame ID imu_link, publishing at 50 Hz
  3. Stereo camera's odom: frame ID odom, publishing at 15 Hz
  4. Stereo camera's IMU, NOT in ENU convention: frame ID zed_imu_link, publishing at 350 Hz
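
In case it helps, here is a rough sketch of how I am currently feeding these inputs into a single EKF instance (assuming the ROS 2 parameter layout of robot_localization; the topic names are placeholders for my actual remappings, and the *_config rows are just my first guess at which fields to fuse):

    ekf_filter_node:
      ros__parameters:
        frequency: 30.0
        two_d_mode: true
        odom_frame: odom
        base_link_frame: base_link
        world_frame: odom

        # 1. wheel encoder odometry (100 Hz): fuse forward velocity and yaw rate only
        odom0: /wheel/odom
        odom0_config: [false, false, false,
                       false, false, false,
                       true,  false, false,
                       false, false, true,
                       false, false, false]
        odom0_differential: false

        # 3. stereo camera visual odometry (15 Hz): fuse pose differentially so its
        #    own origin/heading does not fight the wheel odometry
        odom1: /zed/odom
        odom1_config: [true,  true,  false,
                       false, false, true,
                       false, false, false,
                       false, false, false,
                       false, false, false]
        odom1_differential: true

        # 2. external IMU, already in ENU (50 Hz): yaw, yaw rate, forward acceleration
        imu0: /imu/data
        imu0_config: [false, false, false,
                      false, false, true,
                      false, false, false,
                      false, false, true,
                      true,  false, false]
        imu0_differential: false

        # 4. ZED IMU, not in ENU (350 Hz): only angular velocity and acceleration,
        #    since I am unsure whether fusing its (non-ENU) yaw is valid at all
        imu1: /zed/imu/data
        imu1_config: [false, false, false,
                      false, false, false,
                      false, false, false,
                      false, false, true,
                      true,  false, false]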

These sensors have different frames, different initial values, and different publishing rates. How can I make sure all the sensors are aligned (for example, that they share the same absolute heading or pose)? I noticed that if I fuse the yaw from different sensors, the result oscillates. Do I need to do a sensor calibration? If so, how can I do that?
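
For what it is worth, the only alignment step I have taken so far is publishing a static transform from base_link to the ZED IMU frame, so that robot_localization can rotate that IMU's data into the body frame (assuming the tf2_ros command-line publisher; the offsets below are rough guesses from the mounting, not measured values):

    # argument order: x y z yaw pitch roll frame_id child_frame_id
    ros2 run tf2_ros static_transform_publisher 0.0 0.06 0.015 0.0 0.0 0.0 base_link zed_imu_link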

Secondly, I am not sure whether I should publish the odom TF in this case, since I have a lidar localizer node that performs NDT scan matching and provides the pose and TF directly from map to base_link. How can I make use of robot_localization along with NDT matching?
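
What I was tentatively considering, in case it clarifies the question, is keeping the EKF above as the odom-frame instance (world_frame: odom, publishing odom -> base_link) and adding a second, map-frame instance that fuses the NDT result, so that map -> odom comes out of robot_localization instead of the NDT node publishing map -> base_link directly. A sketch of that second instance, assuming the NDT result can be remapped to a PoseWithCovarianceStamped topic (the /ndt_pose name is just a guess):

    ekf_filter_node_map:
      ros__parameters:
        frequency: 30.0
        two_d_mode: true
        map_frame: map
        odom_frame: odom
        base_link_frame: base_link
        world_frame: map            # this instance would publish map -> odom

        # NDT scan-matching result as an absolute pose in the map frame
        pose0: /ndt_pose
        pose0_config: [true,  true,  false,
                       false, false, true,
                       false, false, false,
                       false, false, false,
                       false, false, false]
        pose0_differential: false

        # same continuous odometry source as the odom-frame instance
        odom0: /wheel/odom
        odom0_config: [false, false, false,
                       false, false, false,
                       true,  false, false,
                       false, false, true,
                       false, false, false]

But I am not sure this is the intended setup, or whether I should simply disable the NDT node's TF output, hence the question.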

Any help or guidance in this regard would be much appreciated. Thank you.


Comments

Given the migration to robotics.stackexchange.com, can you please re-post this there and provide the link?

Tom Moore ( 2023-06-19 03:52:03 -0600 )