Abstract:
Data imbalance and complexity are the key challenges in applying federated learning (FL) techniques to wireless networks. In this paper, we propose a novel framework inspired by a divide-and-conquer algorithm. We aim to develop a full-stack federated distillation (FFD) method for federated learning over a massive Internet of Things network. We first divide the network into sub-regions, each of which can be represented by a neural network model. After performing local training, these models are aggregated into a global model by using a novel knowledge-distillation method. This FFD method allows each local model to be efficiently updated by learning the features of the other models. Furthermore, this method can be easily deployed in new and large-scale environments without requiring the models to be re-trained from scratch. Finally, we conduct extensive simulations to evaluate the performance of the proposed FFD method. The results show that our solution outperforms many contemporary FL techniques under non-IID (i.e., not independent and identically distributed) and imbalanced data.
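For context, the sketch below illustrates one common way a knowledge-distillation step can aggregate locally trained models into a global model, assuming a shared distillation dataset and simple averaging of teacher outputs; the function names and hyperparameters are illustrative and do not reproduce the paper's exact FFD procedure.

```python
# Minimal sketch of distillation-based aggregation for FL (assumed setup,
# not the authors' exact FFD algorithm): the global "student" model is trained
# to match the averaged soft predictions of the locally trained "teacher" models.
import torch
import torch.nn.functional as F

def distill_global_model(local_models, global_model, public_loader,
                         temperature=2.0, epochs=1, lr=1e-3):
    optimizer = torch.optim.Adam(global_model.parameters(), lr=lr)
    for m in local_models:
        m.eval()
    global_model.train()
    for _ in range(epochs):
        for x, _ in public_loader:  # labels unused: distillation only
            with torch.no_grad():
                # Teacher signal: average of the local models' softened predictions.
                teacher_probs = torch.stack(
                    [F.softmax(m(x) / temperature, dim=1) for m in local_models]
                ).mean(dim=0)
            student_log_probs = F.log_softmax(global_model(x) / temperature, dim=1)
            # KL divergence between student and averaged teacher distributions.
            loss = F.kl_div(student_log_probs, teacher_probs,
                            reduction="batchmean") * temperature ** 2
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return global_model
```

In this kind of scheme, only model outputs (soft labels) need to be exchanged rather than full parameter vectors, which is what makes distillation-based aggregation attractive for heterogeneous, non-IID clients.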
Date of Conference: 20-22 October 2022
Date Added to IEEE Xplore: 15 November 2022