Impact Statement:
FL, a novel and promising distributed machine learning framework, has been shown to degrade considerably in performance when the data at the clients are not independent and identically distributed, as is often the case in practice. For this reason, improving the performance of FL in such situations is crucial to its wide deployment. The incremental SL approach described and analyzed in this article is proven, under certain conditions, to enhance performance significantly with the help of a small dataset accessible to the server. Thus, this approach, alone or together with other complementary approaches available in the literature, can help alleviate this shortcoming of FL, thereby making FL more widely applicable.
Abstract:
Federated learning (FL) has emerged as a means of distributed learning using local data stored at clients with a coordinating server. Recent studies showed that FL can suffer from poor performance and slower convergence when training data at the clients are not independent and identically distributed (IID). Here, we consider auxiliary server learning (SL) as a complementary approach to improving the performance of FL on non-IID data. Our analysis and experiments show that this approach can achieve significant improvements in both model accuracy and convergence time even when the dataset utilized by the server is small and its distribution differs from that of the clients’ aggregate data. Moreover, experimental results suggest that auxiliary SL delivers benefits when employed together with other techniques proposed to mitigate the performance degradation of FL on non-IID data.
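The abstract describes interleaving auxiliary server learning with federated averaging: clients train locally on non-IID data, the server aggregates their models, and then takes its own learning step on a small server-side dataset. The paper's exact incremental SL algorithm is not reproduced here; the following is only a minimal sketch of that general pattern for a linear model with squared loss, where all function names (`client_update`, `fedavg_round`, `server_step`) and hyperparameters are illustrative assumptions, not taken from the article.

```python
import numpy as np

def client_update(w, X, y, lr=0.1, epochs=5):
    """Local gradient descent on one client's data (linear model, squared loss)."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(w, clients, lr=0.1):
    """One FedAvg round: clients train locally, server averages by dataset size."""
    updates = [client_update(w, X, y, lr) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    return np.average(updates, axis=0, weights=sizes)

def server_step(w, X_s, y_s, lr=0.05):
    """Auxiliary server learning: a gradient step on the server's small dataset."""
    grad = 2 * X_s.T @ (X_s @ w - y_s) / len(y_s)
    return w - lr * grad

# Synthetic non-IID setup: two clients whose feature distributions are shifted
# in opposite directions, plus a small server dataset centered in between.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])

def make_data(n, shift):
    X = rng.normal(shift, 1.0, size=(n, 2))
    return X, X @ w_true + 0.01 * rng.normal(size=n)

clients = [make_data(50, -1.0), make_data(50, +1.0)]  # non-IID clients
X_s, y_s = make_data(10, 0.0)                          # small server dataset

w = np.zeros(2)
for _ in range(30):
    w = fedavg_round(w, clients)
    w = server_step(w, X_s, y_s)  # interleave auxiliary SL after each FL round
```

The key design point, consistent with the abstract, is that the server's dataset is small (10 points here vs. 100 across clients) and drawn from a distribution that need not match the clients' aggregate data, yet its interleaved updates still steer the global model.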
Published in: IEEE Transactions on Artificial Intelligence (Volume 5, Issue 11, November 2024)