Abstract:
Federated learning involves a central processor that interacts with multiple agents to determine a global model. The process consists of repeatedly exchanging estimates, which may end up divulging private information from the local agents. This scheme can be problematic when dealing with sensitive data, and therefore there is a need to privatize the algorithm. Furthermore, the current architecture of a single server connected to multiple clients is highly sensitive to communication failures and computational overload at the server. In this work, we develop a private multi-server federated learning scheme, which we call graph federated learning. We use cryptographic and differential privacy concepts to privatize the federated learning algorithm over a graph structure. We further show, under convexity and Lipschitz conditions, that the privatized process matches the performance of the non-private algorithm.
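The mechanism sketched in the abstract can be illustrated with a toy simulation: each server applies a differentially private (clipped and noised) local update and then combines its model with its graph neighbors. This is only a minimal sketch under assumed parameters (a hypothetical 3-server ring with a doubly stochastic combination matrix, Gaussian noise, and a simple quadratic loss); it is not the authors' actual algorithm or privacy calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_noisy_update(grad, clip=1.0, sigma=0.5):
    """Clip the update to bound its sensitivity, then add Gaussian noise
    (the standard Gaussian-mechanism recipe; sigma here is illustrative)."""
    norm = np.linalg.norm(grad)
    clipped = grad / max(1.0, norm / clip)
    return clipped + rng.normal(0.0, sigma * clip, size=grad.shape)

# Hypothetical 3-server ring graph: doubly stochastic combination matrix,
# so each server averages its model with its neighbors each round.
A = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])

d = 4                  # model dimension
w = np.zeros((3, d))   # one model copy per server
mu = 0.1               # step size

for _ in range(100):
    # Toy quadratic loss 0.5 * ||w - 1||^2 per server, so the gradient is w - 1;
    # each server privatizes its aggregated local update before sharing.
    grads = np.stack([dp_noisy_update(w[k] - 1.0) for k in range(3)])
    w = A @ (w - mu * grads)   # local gradient step, then graph combination

print(w.mean(axis=0))  # servers' average model; near the minimizer (all ones) up to privacy noise
```

The combination step `A @ (...)` is what distinguishes the graph architecture from the single-server setup: no central node sees all the updates, and a failed link only perturbs one row of the averaging.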
Published in: 2021 IEEE 22nd International Workshop on Signal Processing Advances in Wireless Communications (SPAWC)
Date of Conference: 27-30 September 2021
Date Added to IEEE Xplore: 12 November 2021