Abstract:
In this paper, we quantitatively evaluate how sampling decreases the detectability of anomalous traffic. We build equations to calculate the false positive ratio (FPR) and false negative ratio (FNR) for given values of the sampling rate, the statistics of normal traffic, and the volume of anomalies to be detected. We show that by changing the measurement granularity we can detect anomalies even with a low sampling rate, and we give an equation to derive the optimal granularity by using the relationship between the mean and variance of aggregated flows. With these equations, we can answer practical questions that arise in actual network operations: what sampling rate to set in order to detect a given volume of anomaly, or, if that sampling rate is too high for actual operation, what measurement granularity is optimal for detecting the anomaly under a given lower limit on the sampling rate.
Published in: 2007 IEEE Global Internet Symposium
Date of Conference: 11-11 May 2007
Date Added to IEEE Xplore: 17 September 2007
Print ISBN: 978-1-4244-1697-4
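The abstract does not reproduce the paper's equations. The following is a minimal sketch of how FPR/FNR estimates of this kind could be computed under packet sampling, assuming the sampled packet count of a flow is roughly Binomial(volume, p) and approximating it with a Gaussian; the function name, parameters, and the variance model are illustrative assumptions, not the authors' derivation.

```python
# Illustrative sketch only -- NOT the paper's equations.
# Assumes sampled packet counts are approximately Gaussian, with the
# detection threshold applied to the sampled count of an aggregated flow.
import math
from scipy.stats import norm

def detection_error_rates(p, normal_mean, normal_std, anomaly_volume, threshold):
    """Estimate FPR and FNR under packet sampling rate p (Gaussian approximation).

    p              : packet sampling rate (0 < p <= 1)
    normal_mean    : mean packet count of a normal aggregated flow before sampling
    normal_std     : std of packet counts of normal aggregated flows before sampling
    anomaly_volume : packet count of the anomaly to be detected
    threshold      : alarm threshold on the sampled packet count
    """
    # Sampled normal traffic: mean scales by p; variance combines the original
    # flow-size variance (scaled by p^2) and the binomial sampling noise.
    mu_n = p * normal_mean
    var_n = (p * normal_std) ** 2 + p * (1 - p) * normal_mean
    # Sampled anomaly: Binomial(anomaly_volume, p), Gaussian-approximated.
    mu_a = p * anomaly_volume
    var_a = p * (1 - p) * anomaly_volume
    # FPR: a normal flow's sampled count exceeds the threshold.
    fpr = norm.sf(threshold, loc=mu_n, scale=math.sqrt(var_n))
    # FNR: the anomaly's sampled count falls below the threshold.
    fnr = norm.cdf(threshold, loc=mu_a, scale=math.sqrt(var_a))
    return fpr, fnr

# Example: 1/100 sampling, normal flows around 5000 packets (std 1000),
# and an anomaly of 20000 packets, with an alarm threshold of 100 sampled packets.
print(detection_error_rates(p=0.01, normal_mean=5000, normal_std=1000,
                            anomaly_volume=20000, threshold=100))
```

Lowering p shrinks the gap between the sampled normal and anomalous counts relative to their standard deviations, which is the mechanism by which sampling degrades detectability; coarsening the measurement granularity (aggregating more flows) changes the mean/variance relationship and can restore the separation, as the abstract describes.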