
robot_localization uses the time stamps in the headers of its input messages; the frequency at which inputs arrive plays no role in timing. At a high level, the order of operations is:

  1. Do a ros::spinOnce(), let all the callbacks fire.
  2. For every message that we receive, stick it in a priority queue based on its time stamp, with earlier stamps coming first.
  3. After we've enqueued all messages, we start going through the queue. We predict from the last update time to the measurement's time stamp, then correct. We then pull the next measurement from the queue, predict up to its time stamp, then correct. We repeat this until the queue is empty.

Whether your bag is played back with a rate of 0.1 or 30, the output should be the same. You can verify this by tweaking some of the test launch files in the source code under the test directory and watching the output. Also, see the source code here:

...and here:

Are you sure you're running with use_sim_time set to true? It might be worth double-checking your timing logic for the bag file generation.
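For reference, use_sim_time has to be set before any node starts, and rosbag play needs the --clock flag to publish simulated time on /clock. A typical setup looks like this (the launch file contents around the parameter are up to you):

```xml
<launch>
  <!-- Set before any nodes launch, or they will latch onto wall-clock time. -->
  <param name="/use_sim_time" value="true"/>
</launch>
```

Then play the bag with, e.g., rosbag play --clock your_bag.bag so that /clock is actually published.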