Abstract:
In this paper, we present both improved convergence guarantees and empirical results for federated learning (FL) under imperfect communication. The communication scenario in FL is imperfect due to uplink and downlink channel noise. Our theoretical analysis indicates that downlink noise has a more significant impact than uplink noise, which is corroborated by experimental results. Moreover, we propose strategies for controlling the signal-to-noise ratio (SNR) so that, under imperfect communication, FL achieves convergence comparable to the perfect-communication case. The theoretical analysis delivers multifaceted benefits: it is carried out for non-convex smooth loss functions and thus provides a more holistic result for modern machine learning and deep learning paradigms. Furthermore, the analysis avoids the restrictive bounded-client-dissimilarity assumption, which fails even in the simple case of a strongly convex loss function.
Date of Conference: 29 October 2023 - 01 November 2023
Date Added to IEEE Xplore: 01 April 2024