In this paper, we consider the distributed training of an SVM using measurements collected by the nodes of a Wireless Sensor Network, with the goal of reaching global consensus with the minimum possible inter-node communication for data exchange. We derive a novel mathematical characterization of the optimal selection of partial information that neighboring sensors should exchange in order to reach consensus across the network. We provide a selection function that ranks the training vectors by their importance to the learning process. The amount of information exchanged can be varied through an appropriately chosen threshold on this selection function, providing a desired trade-off between classification accuracy and power consumption. Through simulation experiments, we show that the proposed algorithm uses significantly fewer measurements to reach a consensus that coincides with the optimal hyperplane obtained by a centralized SVM classifier operating on the entire sensor data at a fusion center.
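The abstract does not specify the paper's actual selection function, but the idea of ranking local training vectors and exchanging only those above a threshold of importance can be sketched as follows. This is a hypothetical illustration: it assumes a simple Pegasos-style linear SVM trained locally at each node, and uses distance to the local hyperplane as a stand-in importance score (points near the margin are the likely support vectors, and hence the most informative to share with neighbors).

```python
# Hypothetical sketch of threshold-based selection of training vectors
# for exchange between neighboring sensor nodes. The scoring rule here
# (distance to the local hyperplane) is an assumption, not the paper's
# derived selection function.

def train_linear_svm(data, labels, lam=0.01, epochs=200, lr=0.1):
    """Tiny Pegasos-style subgradient trainer for a 2-D linear SVM."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            margin = y * (w[0] * x[0] + w[1] * x[1] + b)
            if margin < 1:
                # Point violates the margin: move the hyperplane toward it.
                w[0] += lr * (y * x[0] - lam * w[0])
                w[1] += lr * (y * x[1] - lam * w[1])
                b += lr * y
            else:
                # Only apply the regularization shrinkage.
                w[0] -= lr * lam * w[0]
                w[1] -= lr * lam * w[1]
    return w, b

def selection_scores(data, w, b):
    """Score each vector by |w.x + b| / ||w||: small = near hyperplane."""
    norm = (w[0] ** 2 + w[1] ** 2) ** 0.5 or 1.0
    return [abs(w[0] * x[0] + w[1] * x[1] + b) / norm for x in data]

def select_for_exchange(data, labels, tau):
    """Return the subset of local measurements a node would transmit:
    those within distance tau of the locally trained hyperplane."""
    w, b = train_linear_svm(data, labels)
    scores = selection_scores(data, w, b)
    return [x for x, s in zip(data, scores) if s <= tau]

if __name__ == "__main__":
    # Toy local dataset at one sensor: two linearly separable clusters.
    data = [(0.0, 1.0), (1.0, 2.0), (0.5, 3.0),
            (3.0, 0.0), (4.0, 1.0), (3.5, -1.0)]
    labels = [1, 1, 1, -1, -1, -1]
    subset = select_for_exchange(data, labels, tau=1.5)
    print(len(subset), "of", len(data), "vectors selected for exchange")
```

Raising the threshold `tau` exchanges more vectors (higher accuracy, more radio transmissions); lowering it exchanges fewer (less power, potentially slower or coarser consensus), which mirrors the accuracy/power trade-off described above.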