Abstract:
Federated learning (FL) trains a model collaboratively across multiple remote clients without sharing raw data. A key challenge in FL is reducing network transmission. This article aims to reduce network traffic by transmitting fewer neural network parameters. We first investigate the similarities between corresponding layers of convolutional neural network (CNN) models in FL and find substantial redundant information in the models' feature extractors. Based on this observation, we propose a communication-efficient federated aggregation algorithm named FedSL (Federated Split Layers) to reduce communication overhead. FedSL divides client models into groups along the depth dimension according to the number of global model layers. A Max-Min client selection strategy chooses the participants for each layer. Each client transfers only the partial parameters of its selected layers, which reduces the number of transmitted parameters. FedSL aggregates the global model within each group and concatenates the parameters of all groups in layer order. Experimental results demonstrate that FedSL improves communication efficiency over baseline algorithms (FedAvg, FedProx, and MOON), reducing communication cost by 42% with a VGG-style CNN and by 70% with ResNet-9, while maintaining model accuracy comparable to the baselines.
Published in: IEEE Internet of Things Journal (Volume: 11, Issue: 9, 01 May 2024)
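The abstract describes layer-wise grouping, per-layer client selection, and per-group aggregation, but gives no implementation details. The following Python sketch only illustrates that idea under our own assumptions: the functions `max_min_select` and `aggregate_fedsl`, the score-based selection criterion, and the toy data are all hypothetical, not the paper's actual method.

```python
import numpy as np

def max_min_select(scores, k):
    """Hypothetical Max-Min selection: greedily pick k clients so that each new
    pick maximizes its minimum score distance to already-chosen clients.
    The paper's exact selection criterion is not specified in the abstract."""
    chosen = [int(np.argmax(scores))]  # start from the highest-scoring client
    while len(chosen) < k:
        dists = [min(abs(s - scores[c]) for c in chosen) for s in scores]
        for c in chosen:
            dists[c] = -1.0            # never re-pick an already-chosen client
        chosen.append(int(np.argmax(dists)))
    return chosen

def aggregate_fedsl(client_layers, layer_scores, clients_per_layer):
    """Sketch of split-layer aggregation.

    client_layers[i][l] : parameters of layer l from client i
    layer_scores[l][i]  : selection score of client i for layer l (assumed)
    clients_per_layer   : number of clients selected for each layer group
    """
    num_layers = len(client_layers[0])
    global_model = []
    for l in range(num_layers):
        selected = max_min_select(layer_scores[l], clients_per_layer)
        # In a real system only the selected clients would upload this layer,
        # which is where the communication saving comes from.
        layer_avg = np.mean([client_layers[i][l] for i in selected], axis=0)
        global_model.append(layer_avg)
    # The global model is the concatenation of all group results in layer order.
    return global_model

# Toy usage: 4 clients, a 3-layer model, 2 clients selected per layer.
rng = np.random.default_rng(0)
clients = [[rng.normal(size=(5, 5)) for _ in range(3)] for _ in range(4)]
scores = rng.random(size=(3, 4))
global_model = aggregate_fedsl(clients, scores, clients_per_layer=2)
print([w.shape for w in global_model])
```

In this sketch, only the parameters of the clients selected for a given layer group enter that group's average, so each client would upload only the layers it was selected for; the per-group averages are then stitched together in layer order to form the global model, mirroring the abstract's description.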