Byzantine-Robust and Communication-Efficient Personalized Federated Learning



Abstract:

This paper investigates personalized federated learning, in which a group of workers is coordinated by a server to train correlated local models in addition to a common global model. This distributed statistical learning problem faces two challenges: efficiency of information exchange between the workers and the server, and robustness to potentially malicious messages from so-called Byzantine workers. We propose a projected stochastic block gradient descent method to address the robustness issue. Therein, each regular worker learns in a personalized manner with the aid of the global model, and the server judiciously aggregates the local models via a Huber function-based descent step. To improve communication efficiency, we allow the regular workers to perform multiple local update steps per communication round. Convergence of the proposed method is established for non-convex personalized federated learning. Numerical experiments on neural network training validate the advantages of the proposed method over existing ones.
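To illustrate the robustness mechanism described above, here is a minimal sketch (not the authors' code) of a Huber function-based server aggregation step. The step size eta and Huber threshold delta are hypothetical parameters chosen for illustration; the Huber penalty's gradient is the residual clipped to [-delta, delta], which bounds how far any single (possibly Byzantine) worker's model can pull the global model.

```python
import numpy as np

def huber_server_step(global_model, local_models, eta=0.1, delta=0.5):
    """One robust descent step on the global model (illustrative sketch).

    Each worker's contribution is the residual (global - local) clipped
    elementwise to [-delta, delta], i.e., the gradient of the Huber
    penalty. Clipping caps the influence of outlier (Byzantine) models.
    """
    grad = np.zeros_like(global_model)
    for w_i in local_models:
        residual = global_model - w_i
        grad += np.clip(residual, -delta, delta)  # Huber gradient per worker
    return global_model - eta * grad / len(local_models)

# Toy usage: 9 honest workers near the true model, 1 Byzantine outlier.
rng = np.random.default_rng(0)
truth = np.ones(5)
local_models = [truth + 0.05 * rng.standard_normal(5) for _ in range(9)]
local_models.append(100.0 * np.ones(5))  # Byzantine model, far from the truth

w = np.zeros(5)
for _ in range(200):
    w = huber_server_step(w, local_models)
print(w)  # stays close to `truth`; a plain mean would be pulled toward 100
```

A plain averaging step here would land near 10.9 in every coordinate, while the clipped (Huber) step keeps the global model within roughly delta/9 of the honest consensus, which is the intuition behind the robustness claim.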
Date of Conference: 04-10 June 2023
Date Added to IEEE Xplore: 05 May 2023
Conference Location: Rhodes Island, Greece
