Adding network delay larger than the ROS rate period?
Hi!
I have a setup using Noetic where I want to emulate network delay with netimpair (which itself uses tc and netem) between two nodes.
A talker publishes the current time at a rate of 10 Hz and a subscribing listener calculates the delay. Additionally, for debugging purposes, I print the time period since the last reception (last message callback) to the console. Netimpair is acting on the publishing side of the setup; I checked the port using rosnode info.
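For reference, I believe the rule netimpair applies under the hood is equivalent to a plain netem delay qdisc like the following (the interface name `eth0` is just a placeholder for whichever device carries the topic traffic):

```shell
# Add a fixed 110 ms egress delay on the publishing machine's interface
sudo tc qdisc add dev eth0 root netem delay 110ms

# Remove it again when done
sudo tc qdisc del dev eth0 root netem
```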
Everything works as expected until I set the delay to a value greater than 1/rate (i.e., 100 ms). When I set the delay to, e.g., 110 ms, I observe the following:
- The measured delay increases periodically in steps of 10 ms from message to message until it resets at 200 ms (110; 120; 130; ... 200; 110; ...).
- The delay between the receptions of the 200 ms and 110 ms messages is close to zero, meaning the two messages are received almost simultaneously.
- The delay between all other message receptions is 110 ms (slower than the messages are actually sent, but averaged together with the near-zero interval this corresponds to a rate of 10 Hz).
- No messages are lost as long as the queue size is not set to 1.
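The arithmetic of the pattern above is at least self-consistent: taking the send period as 100 ms and the measured delays as the reported cycle 110, 120, ..., 200 ms, the inter-arrival intervals come out as nine 110 ms gaps followed by one ~10 ms gap, and the average period is exactly 100 ms (10 Hz). A minimal sketch of that calculation:

```python
# Check the arithmetic of the observed pattern: messages sent every
# 100 ms, measured one-way delays cycling 110, 120, ..., 200 ms.
SEND_PERIOD_MS = 100
delays_ms = list(range(110, 201, 10))  # one cycle of measured delays

# Arrival time of message i = send time + measured delay; append the
# first delay of the next cycle to capture the reset from 200 to 110 ms.
arrivals = [i * SEND_PERIOD_MS + d for i, d in enumerate(delays_ms + delays_ms[:1])]
intervals = [b - a for a, b in zip(arrivals, arrivals[1:])]

print(intervals)                        # nine 110 ms gaps, then one 10 ms gap
print(sum(intervals) / len(intervals))  # average period: 100.0 ms, i.e. 10 Hz
```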
I suspect that the behavior stems from the way TCPROS handles messages, but so far I lack a deeper understanding and can't explain this unexpected behavior. It seems like when a delayed TCP packet has not been sent yet, ROS just packs another message onto the packet in the delay queue.
Can somebody explain this or, even better, give a hint on how to achieve the expected delay independent of the ROS rate? Thanks!