
The `MedianFilter` and `MeanFilter` filters, which are used via the `LaserArrayFilter`, are quite useful, and you could use them in place of doing your own averaging.
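To make the idea concrete, here is a sketch of what those filters do conceptually: a per-index median or mean over a short history of scans. This is an illustration in plain NumPy, not the actual `laser_filters` implementation (which runs inside a filter chain node); the function names are mine.

```python
import numpy as np

def median_over_time(scan_history):
    """scan_history: list of equal-length range arrays, newest last.
    Returns the per-index median over the stored scans."""
    stacked = np.vstack(scan_history)   # shape: (n_scans, n_ranges)
    return np.median(stacked, axis=0)

def mean_over_time(scan_history):
    """Per-index mean over the stored scans."""
    stacked = np.vstack(scan_history)
    return np.mean(stacked, axis=0)
```

Note how the median shrugs off a single wild reading (e.g. a 100 m spike in one scan) while the mean gets dragged toward it; that is why a median filter is often the first thing to try on cheap lidar data.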

You can take care of the noise through various statistical methods ranging from the simple (average a bunch of values over time and/or space) to the hard-but-impresses-your-boss.

A simple approach that sometimes works is to calculate the centre of the range of distances in a wedge (i.e. `closest + (farthest - closest)/2`) and discard all values that are a certain distance away from that centre. It's kind of like a poor man's quartile filter. A more statistically accurate approach is to discard values that are a certain number of standard deviations from the mean, or to use only the second and third quartiles of the data, based on distance. Essentially what you are doing is trying to find a cluster of similar distances and using that as the "true" value. However, the larger the wedge, the less accurate these sorts of approaches become, because you can easily end up ignoring smaller obstacles in the environment, or a corner that only appears in one small part of the wedge. These techniques are really better applied to a single range value over time rather than to a set of range values distributed over an angle.
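Both trimming ideas above fit in a few lines each. This is a hedged sketch assuming NumPy; the function names and the cutoff parameters (`max_offset`, `n_sigma`) are illustrative, and you would tune them for your sensor.

```python
import numpy as np

def midrange_trim(ranges, max_offset):
    """The 'poor man's quartile filter': keep only values within
    max_offset of the centre of the range span."""
    r = np.asarray(ranges, dtype=float)
    centre = r.min() + (r.max() - r.min()) / 2.0
    return r[np.abs(r - centre) <= max_offset]

def sigma_trim(ranges, n_sigma=1.0):
    """Keep only values within n_sigma standard deviations of the mean."""
    r = np.asarray(ranges, dtype=float)
    return r[np.abs(r - r.mean()) <= n_sigma * r.std()]
```

Beware that a single extreme outlier shifts the mid-range centre a long way (it moves by half the outlier's offset), which is exactly why the standard-deviation version is the more robust of the two.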

The *really* fancy way to do something like this is to know the noise model of the sensor. This gives you a statistical probability for where the measurement value is likely to be for a given distance. Although noise models are more commonly used on single-value sensors over time (i.e. you can estimate the actual value by seeing how the measured noisy value jumps around the distribution of possible values), you can still use this information to weight each measurement that goes into the final value for each wedge - especially if you are filtering over time as well. You didn't say what sensor you are using, but since you are using the Turtlebot 3 I'm guessing that it's that cheap little scanner that comes with it? I doubt the maker provides a noise model for that one, unfortunately - and with that sensor I think it's almost certainly noise rather than bugs in your code causing you problems. You would have to construct one yourself by taking lots of careful measurements and accumulating data to process into a statistical model. That's probably a bit more work than is worth it for your task. :)
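If you did have a noise model, one common way to use it is inverse-variance weighting: each measurement contributes in proportion to how much you trust it at that distance. This is a sketch under the assumption that you can express the model as a function mapping distance to expected standard deviation; the function names are mine, not from any ROS package.

```python
import numpy as np

def weighted_wedge_distance(ranges, noise_sigma):
    """Combine the measurements in a wedge into one value, weighting
    each reading by the inverse of the variance your (assumed) noise
    model predicts at that distance.

    noise_sigma: callable mapping a distance to its expected std dev.
    """
    r = np.asarray(ranges, dtype=float)
    weights = 1.0 / np.array([noise_sigma(d) ** 2 for d in r])
    return float(np.sum(weights * r) / np.sum(weights))
```

With a constant-sigma model this collapses to a plain mean; with sigma growing with distance, near readings dominate far ones, which is usually what you want from a cheap scanner.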

I think that, given your sensor and your relatively simple goal, the best approach would be to filter the noise out either before or after calculating the value for the wedge. You can use a moving window of the data over time and calculate the value of the windowed data using something as simple as averaging each index over the time range. Or you can use a more accurate and advanced approach of applying a statistical filter that operates over time, such as a Kalman filter (filtering a noisy signal to estimate the actual value is the canonical application of a Kalman filter). You would need to experiment to find out whether you need to do this on the raw data from the sensor or whether you can get away with doing your post-processing after you calculate the value for each wedge. If you decide to use a Kalman filter, then doing it on the computed value for each wedge would be more computationally efficient, although the results may not be as good as doing it on each individual range measurement.
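For the Kalman option, the scalar case is small enough to write out in full. This is a minimal sketch of a 1-D constant-value Kalman filter you could run on each wedge's value over time; the class name and the default `q`/`r` values are my own guesses and would need tuning against your sensor.

```python
class Kalman1D:
    """Minimal scalar Kalman filter for smoothing one wedge value
    over time, assuming the true value changes slowly.

    q: process noise variance (how fast the true value can drift)
    r: measurement noise variance (how noisy the sensor is)
    """
    def __init__(self, q=1e-3, r=0.1, x0=0.0, p0=1.0):
        self.q, self.r = q, r
        self.x, self.p = x0, p0   # state estimate and its variance

    def update(self, z):
        self.p += self.q                   # predict: uncertainty grows
        k = self.p / (self.p + self.r)     # Kalman gain
        self.x += k * (z - self.x)         # correct toward measurement z
        self.p *= (1.0 - k)
        return self.x
```

You would keep one `Kalman1D` instance per wedge and call `update()` with each new wedge value; the estimate settles quickly on a stable reading and damps out the jitter.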

By the way, the `LaserScanRangeFilter` will help you with removing values that are below or above thresholds - although maybe it's overkill if you only have 8 values to check, not 1024.
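If you would rather do that thresholding yourself for 8 values, it is a one-liner. This sketch mimics the spirit of `LaserScanRangeFilter` (out-of-bounds readings replaced with NaN); the function and parameter names here are illustrative, not the filter's actual parameters.

```python
import numpy as np

def threshold_ranges(ranges, lower, upper, replacement=float('nan')):
    """Replace readings outside [lower, upper] with a sentinel value,
    similar in spirit to what LaserScanRangeFilter does."""
    r = np.asarray(ranges, dtype=float)
    return np.where((r < lower) | (r > upper), replacement, r)
```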
