Abstract:
Federated learning (FL) allows local clients to train a global model by cooperating with a server while ensuring that their raw data is not revealed. However, most existing works choose clients randomly, regardless of their capabilities and contributions to training. Moreover, existing FL client selection mechanisms typically address only one of the two major challenges: system heterogeneity or statistical heterogeneity. This paper manages both the system and statistical heterogeneity of distributed clients in the network. First, to manage system heterogeneity, an optimization objective is proposed to maximize the number of clients with similar capabilities, such as storage, computational, and communication capabilities. Then, a network framework with a logical layer is proposed to group similar clients logically by checking their capabilities. Finally, to manage the statistical heterogeneity among clients, a novel Contribution-based Dynamic Federated training strategy, called CDFed, is designed to dynamically adjust the probability of each client being chosen based on Shapley values in each global round. Experimental results on two baseline datasets, MNIST and FMNIST, demonstrate that our proposal converges roughly 50% faster and achieves an average test accuracy at least 1% higher than the baselines in most cases.
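The abstract does not give CDFed's exact update rule, but the core idea, scoring each client's contribution with a Shapley value and mapping the scores to selection probabilities, can be sketched as follows. The per-client quality scores, the additive coalition utility, and the softmax mapping are all illustrative assumptions, not the paper's actual method; real FL systems would also estimate Shapley values by Monte Carlo sampling rather than exact enumeration, which is factorial in the number of clients.

```python
import itertools
import math

def shapley_values(clients, utility):
    """Exact Shapley value of each client under a coalition utility function.

    Averages each client's marginal contribution over all join orders.
    Exact enumeration is O(n!) and only feasible for a handful of clients;
    shown here purely for illustration.
    """
    values = {c: 0.0 for c in clients}
    perms = list(itertools.permutations(clients))
    for perm in perms:
        coalition = []
        prev = utility(coalition)
        for c in perm:
            coalition.append(c)
            cur = utility(coalition)
            values[c] += cur - prev  # marginal contribution of c in this order
            prev = cur
    return {c: v / len(perms) for c, v in values.items()}

def selection_probabilities(values, temperature=1.0):
    """Softmax over Shapley values -> per-round client sampling probabilities.

    (Assumed mapping; the paper's actual probability update may differ.)
    """
    m = max(values.values())  # subtract max for numerical stability
    exps = {c: math.exp((v - m) / temperature) for c, v in values.items()}
    z = sum(exps.values())
    return {c: e / z for c, e in exps.items()}

# Toy setting: utility of a coalition is the sum of hypothetical per-client
# "data quality" scores, so each Shapley value equals the client's own score.
quality = {"A": 0.5, "B": 0.3, "C": 0.1}
util = lambda coalition: sum(quality[c] for c in coalition)

sv = shapley_values(list(quality), util)
probs = selection_probabilities(sv)
```

Because the toy utility is additive, each client's Shapley value collapses to its individual score, and the softmax then biases selection toward higher-contribution clients while still giving every client nonzero probability.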
Date of Conference: 08-12 May 2023
Date Added to IEEE Xplore: 21 June 2023