Abstract:
Federated learning (FL) is a promising distributed machine learning scheme in which multiple clients collaborate by sharing a common learning model while keeping their private data local. It can be applied to many applications, e.g., training an automatic driving system from the perception data of multiple vehicles. However, some clients may join the training system dynamically, which affects the stability and accuracy of the learning system in IoT settings. Meanwhile, data heterogeneity in the FL system further exacerbates this problem due to imbalanced data distributions. To solve these problems, we propose a novel FL framework named FedREM (Retain-Expansion and Matching), which guides client training through two mechanisms: 1) a Retain-Expansion mechanism that lets clients perform local training and automatically extract data characteristics during training, and 2) a Matching mechanism that ensures new clients quickly adapt to the global model by matching their data characteristics and adjusting the model accordingly. Results of extensive experiments verify that FedREM outperforms various baselines in terms of model accuracy, communication efficiency, and system robustness.
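To make the Matching idea concrete, the following is a minimal sketch of how a newly joining client might be matched against data characteristics recorded during training. Everything here is an assumption for illustration only: the function names, the use of mean feature vectors as "data characteristics," and the choice of cosine similarity are hypothetical, not the actual FedREM mechanism.

```python
# Hypothetical illustration of matching a new client's data summary
# against data characteristics extracted during training.
# NOT the actual FedREM algorithm; all names and the similarity
# measure are assumptions for illustration.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def match_client(client_summary, characteristics):
    """Return the key of the stored data characteristic most similar
    to the new client's local-data summary."""
    return max(characteristics,
               key=lambda k: cosine(client_summary, characteristics[k]))

# Hypothetical data characteristics accumulated during training
characteristics = {
    "group_a": [0.9, 0.1, 0.0],
    "group_b": [0.1, 0.8, 0.1],
}

# A new client whose local-data summary resembles group_a
print(match_client([0.85, 0.15, 0.0], characteristics))  # -> group_a
```

Once matched, the new client could initialize from the model variant associated with the matched group rather than a generic global model, which is one plausible way to realize the "quickly adapt" behavior the abstract describes.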
Published in: IEEE Transactions on Parallel and Distributed Systems (Volume: 35, Issue: 7, July 2024)