Abstract:
Federated Learning has been widely adopted in privacy-sensitive distributed machine learning. However, in heterogeneous scenarios where significant differences exist in user data distribution, computational capability, or network conditions, existing models often underperform. This paper models key metrics such as communication delay and employs an integer programming approach to intelligently select the devices that participate in training. Additionally, an efficient federated learning algorithm, AdapFed, is proposed. In the early stages of training, AdapFed prioritizes updates from devices with rapidly changing gradients, while later rounds involve a broader range of diverse devices. The algorithm also enforces a constraint on the average waiting time across the selected device sets throughout training. Experimental results on three public datasets demonstrate that, compared to five baseline algorithms, the proposed framework improves model accuracy by 6.4% within the same number of training rounds and reduces the average waiting time among users by 71.4%.
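The following is a minimal illustrative sketch of the adaptive device-selection idea described above, not the authors' implementation: per-device statistics (gradient_delta, diversity, wait_time) and the greedy selection routine are assumptions standing in for the paper's integer program and waiting-time constraint.

```python
# Hypothetical sketch of AdapFed-style device selection.
# All names and statistics here are illustrative, not taken from the paper.
import numpy as np

def select_devices(gradient_delta, diversity, wait_time,
                   round_idx, total_rounds, k, max_avg_wait):
    """Pick up to k devices, shifting emphasis from gradient change (early rounds)
    to device diversity (later rounds), under an average-waiting-time budget."""
    # Interpolation weight: early rounds favor rapidly changing gradients,
    # later rounds favor a broader, more diverse set of devices.
    alpha = 1.0 - round_idx / total_rounds
    score = alpha * gradient_delta + (1.0 - alpha) * diversity

    # Greedy stand-in for the paper's integer-programming selection:
    # take devices by score, skipping any whose inclusion would violate
    # the average-waiting-time constraint.
    chosen = []
    for i in np.argsort(-score):
        candidate = chosen + [int(i)]
        if np.mean(wait_time[candidate]) <= max_avg_wait:
            chosen = candidate
        if len(chosen) == k:
            break
    return chosen

# Example usage with random per-device statistics (hypothetical values).
rng = np.random.default_rng(0)
n = 20
selected = select_devices(
    gradient_delta=rng.random(n),   # magnitude of recent gradient change
    diversity=rng.random(n),        # proxy for data-distribution diversity
    wait_time=rng.random(n) * 10.0, # estimated per-device waiting time (s)
    round_idx=3, total_rounds=100, k=5, max_avg_wait=5.0)
print(selected)
```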
Published in: ICASSP 2025 - 2025 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Date of Conference: 06-11 April 2025
Date Added to IEEE Xplore: 07 March 2025