In information theory, the relative entropy (also called information divergence or information distance) quantifies the difference between information flows characterised by different probability distributions. In this study, the authors first resolve the asymmetry of the Rényi divergence and the Kullback-Leibler divergence, converting these divergence measures into proper metrics. The authors then propose an effective metric for detecting distributed denial-of-service attacks, using the Rényi divergence to measure the difference between legitimate flows and attack flows in a network. With the proposed metric, the authors can obtain the optimal detection sensitivity and the optimal information distance between attack flows and legitimate flows by adjusting the order of the Rényi divergence. The experimental results show that the proposed metric clearly enlarges the adjudication distance; therefore, it not only detects attacks early but also sharply reduces the false positive rate compared with the traditional Kullback-Leibler divergence and distance approaches.
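The abstract does not give the paper's exact symmetrization construction, but the standard approach is to average the divergence over both argument orders. The sketch below, a minimal illustration assuming discrete distributions with matching support, computes the Rényi divergence of order α (which tends to the Kullback-Leibler divergence as α → 1) and a symmetrized distance; the function names are illustrative, not taken from the paper.

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) of two discrete distributions.

    D_alpha(P || Q) = (1 / (alpha - 1)) * log(sum_i p_i^alpha * q_i^(1 - alpha)).
    At alpha == 1 the KL-divergence limit is used instead.
    """
    if alpha == 1.0:
        # Limit case: Kullback-Leibler divergence, skipping zero-probability terms.
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1.0)

def symmetric_renyi_distance(p, q, alpha):
    """Symmetrize the (asymmetric) divergence by averaging both directions."""
    return 0.5 * (renyi_divergence(p, q, alpha) + renyi_divergence(q, p, alpha))

# Illustrative traffic profiles: a legitimate-flow and an attack-flow distribution.
legit = [0.5, 0.3, 0.2]
attack = [0.1, 0.1, 0.8]
for a in (0.5, 1.0, 2.0):
    print(a, symmetric_renyi_distance(legit, attack, a))
```

Sweeping α, as in the loop above, is how one would tune the order to maximise the separation between legitimate and attack flows; unlike the raw divergence, the averaged quantity gives the same value regardless of argument order.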
Date of Publication: December 2009