Abstract:
The problem of statistical heterogeneity in Federated Learning has been a major challenge, with existing solutions making unrealistic assumptions about the availability of shared datasets and high bandwidth between clients and the server. Solving this problem is crucial for the success of Federated Learning in real-world scenarios. In this work, we propose FedDual, a biologically inspired dual-network model that mimics how the human brain learns and memorizes information. The model consists of a neocortical network and a hippocampal network, analogous to those in the human brain. The hippocampal network is an image classification model, while the neocortical network is a variational auto-encoder responsible for long-term, recallable memory. FedDual uses the neocortical network to generate pseudo-patterns (synthetic data) on the server (global model), which are then used to train the hippocampal network. The dual-network architecture allows devices to share information with the server through the weight updates of the neocortical network, without transmitting the actual data. We compare FedDual against alternatives from the literature on widely available datasets. FedDual not only achieves higher accuracy than the alternatives but also converges faster, requiring fewer communication rounds.
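Since the abstract only outlines the mechanism, the following is a minimal, hypothetical sketch of one FedDual-style server round in PyTorch. All names and details here (the VAE architecture, fedavg, server_round, label_fn, the latent dimension, the optimizer, and how pseudo-labels are assigned) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of one FedDual-style round (assumptions, not the paper's code).
import torch
import torch.nn as nn

class VAE(nn.Module):
    """Neocortical network: a generative, long-term memory (assumed architecture)."""
    def __init__(self, in_dim=784, z_dim=20):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.mu, self.logvar = nn.Linear(256, z_dim), nn.Linear(256, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(),
                                 nn.Linear(256, in_dim), nn.Sigmoid())

    def decode(self, z):
        return self.dec(z)

def fedavg(state_dicts):
    """Average the clients' neocortical (VAE) weight updates on the server."""
    return {k: torch.stack([sd[k].float() for sd in state_dicts]).mean(0)
            for k in state_dicts[0]}

def server_round(global_vae, client_vae_states, classifier, label_fn, n_pseudo=512):
    # 1. Aggregate neocortical (VAE) updates -- no raw client data is shared.
    global_vae.load_state_dict(fedavg(client_vae_states))
    # 2. Generate pseudo-patterns (synthetic data) from the global VAE.
    with torch.no_grad():
        z = torch.randn(n_pseudo, 20)
        pseudo_x = global_vae.decode(z)
        pseudo_y = label_fn(pseudo_x)  # label_fn is a placeholder for pseudo-labelling
    # 3. Train the hippocampal network (classifier) on the pseudo-patterns.
    opt = torch.optim.SGD(classifier.parameters(), lr=0.01)
    loss = nn.functional.cross_entropy(classifier(pseudo_x), pseudo_y)
    opt.zero_grad(); loss.backward(); opt.step()
    return classifier
```

The sketch only illustrates the division of roles described in the abstract: clients upload neocortical (VAE) weights, the server aggregates them, samples synthetic data, and uses it to update the hippocampal classifier.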
Date of Conference: 18-23 June 2023
Date Added to IEEE Xplore: 02 August 2023