Abstract:
This paper investigates personalized federated learning, in which a group of workers is coordinated by a server to train correlated local models in addition to a common global model. This distributed statistical learning problem faces two challenges: efficiency of information exchange between the workers and the server, and robustness to potentially malicious messages from so-called Byzantine workers. We propose a projected stochastic block gradient descent method to address the robustness issue. Therein, each regular worker learns in a personalized manner with the aid of the global model, and the server judiciously aggregates the local models via a Huber function-based descent step. To improve communication efficiency, we allow the regular workers to perform multiple local update steps per communication round. Convergence of the proposed method is established for non-convex personalized federated learning. Numerical experiments on neural network training validate the advantages of the proposed method over existing ones.
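To make the two ingredients of the abstract concrete, the following is a minimal sketch of one communication round: each worker runs multiple local update steps coupled to the global model, and the server takes a Huber function-based descent step on the received local models so that large (possibly Byzantine) deviations are clipped. The proximal coupling term, the omission of the projection step, and all names and hyperparameters (huber_grad, lam, delta, etc.) are assumptions made for illustration, not the authors' exact algorithm.

```python
import numpy as np

def huber_grad(residual, delta):
    """Gradient of the Huber function, applied elementwise:
    linear in the residual when it is small, clipped to +/- delta otherwise."""
    return np.clip(residual, -delta, delta)

def server_huber_descent(global_model, worker_models, step_size, delta):
    """One server-side descent step on the sum of Huber penalties between
    the global model and each received local model; clipping bounds the
    influence of any single (possibly Byzantine) worker."""
    grad = np.zeros_like(global_model)
    for w in worker_models:
        grad += huber_grad(global_model - w, delta)
    return global_model - step_size * grad

def local_update(local_model, global_model, data, grad_fn,
                 lam, lr, num_local_steps):
    """Multiple local stochastic gradient steps per communication round.
    The term lam * (local_model - global_model) keeps the personalized
    model anchored to the global one (the projection step is omitted here)."""
    for _ in range(num_local_steps):
        g = grad_fn(local_model, data) + lam * (local_model - global_model)
        local_model = local_model - lr * g
    return local_model
```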
Published in: ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Date of Conference: 04-10 June 2023
Date Added to IEEE Xplore: 05 May 2023