Explanation on sensor_msgs/LaserScan and time
Hi everyone!
I am currently working on LaserScan messages and I don't quite understand how the time-related attributes should be filled. If I look at the documentation:
Header header # timestamp in the header is the acquisition time of
# the first ray in the scan.
#
# in frame frame_id, angles are measured around
# the positive Z axis (counterclockwise, if Z is up)
# with zero angle being forward along the x axis
float32 angle_min # start angle of the scan [rad]
float32 angle_max # end angle of the scan [rad]
float32 angle_increment # angular distance between measurements [rad]
float32 time_increment # time between measurements [seconds] - if your scanner
# is moving, this will be used in interpolating position
# of 3d points
float32 scan_time # time between scans [seconds]
float32 range_min # minimum range value [m]
float32 range_max # maximum range value [m]
float32[] ranges # range data [m] (Note: values < range_min or > range_max should be discarded)
float32[] intensities # intensity data [device-specific units]. If your
# device does not provide intensities, please leave
# the array empty.
I understand that scan_time is the total duration of one scan (i.e., the duration of my n measurements) and that time_increment is the duration of one measurement (so scan_time = time_increment × n).
But then I began to play around with this bag file, which replays LaserScan messages ( https://github.com/mlab-upenn/f1_10_c... ), and I realized that my assumptions are not right. By doing:
rostopic echo /scan
I got one of the message:
header:
  seq: 40906
  stamp:
    secs: 1458504920
    nsecs: 148172000
  frame_id: "laser"
angle_min: -1.57079637051
angle_max: 1.56643295288
angle_increment: 0.00436332309619
time_increment: 1.73611115315e-05
scan_time: 0.0250000003725
range_min: 0.0230000000447
range_max: 60.0
I know I have n = 720 measurements, because I checked the size of ranges and because:
full_angle / angle_increment = (1.56643295288 + 1.57079637051) / 0.00436332309619 ≈ 719 (+ 1 for the first element).
But when I do the same with time, I get 2 × n as the result:
scan_time / time_increment = 0.0250000003725 / 1.73611115315e-05 = 1440
So why is that? Am I missing something?
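To make sure the discrepancy isn't just a calculator slip on my end, here is a small Python sketch that redoes both ratios from the values in the echoed message above (the numbers are copied verbatim from the output; nothing else is assumed):

```python
# Values copied from the `rostopic echo /scan` output above
angle_min = -1.57079637051
angle_max = 1.56643295288
angle_increment = 0.00436332309619
time_increment = 1.73611115315e-05
scan_time = 0.0250000003725

# Number of measurements implied by the angular span
# (+1 because both endpoints are included)
n_angle = round((angle_max - angle_min) / angle_increment) + 1

# Number of measurement intervals implied by the timing fields
n_time = round(scan_time / time_increment)

print(n_angle)  # 720
print(n_time)   # 1440
```

So the angular fields give 720 samples while the timing fields give 1440 intervals, which is exactly the 2 × n factor in question.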
This bag file seems correct, because I managed to reproduce the simulation exactly as the author did in a video ( https://www.youtube.com/watch?v=3C_eR... ).