
Differentially Private Federated Learning with Drift Control



Abstract:

In this paper, we consider the problem of differentially private federated learning with statistical data heterogeneity. Specifically, users collaborate with a parameter server (PS) to jointly train a machine learning model using their local datasets, which are non-i.i.d. across users. The PS is assumed to be honest-but-curious, so users' data must be kept private from it: interactions between the PS and the users must satisfy differential privacy (DP) for each user. In this work, we propose a differentially private mechanism that simultaneously handles the user drift caused by non-i.i.d. data and the randomized user participation in the training process. In particular, we study SCAFFOLD, a popular federated learning algorithm that has shown better performance in dealing with non-i.i.d. data than earlier federated averaging algorithms, and analyze its convergence rate under a differential privacy constraint. Our convergence results account for time-varying perturbation noises used by the users as well as data and user sampling. We propose two time-varying noise allocation schemes that achieve a better convergence rate while satisfying a total DP budget. We also conduct experiments on a real-world dataset to confirm our theoretical findings.
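The abstract describes combining SCAFFOLD's control variates (to counter client drift from non-i.i.d. data) with per-round Gaussian perturbation of clipped client updates under a time-varying noise schedule. The sketch below illustrates the general shape of such a client update; it is not the authors' exact mechanism, and the function names, the control-variate update rule, and the round-dependent noise multiplier `sigma_t` are illustrative assumptions.

```python
import numpy as np

def clip_l2(v, c):
    """Clip a vector to L2 norm at most c (standard DP pre-processing)."""
    norm = np.linalg.norm(v)
    return v if norm <= c else v * (c / norm)

def dp_scaffold_client_update(x_global, c_global, c_local, grad_fn, data,
                              steps, lr, clip_norm, sigma_t, rng):
    """One client's SCAFFOLD-style local update with Gaussian perturbation.

    x_global          : current global model from the parameter server (PS)
    c_global, c_local : server and client control variates (drift correction)
    grad_fn           : stochastic gradient oracle grad_fn(model, batch)
    sigma_t           : round-dependent noise multiplier (time-varying allocation)
    """
    x = x_global.copy()
    for _ in range(steps):
        batch = data[rng.choice(len(data), size=32)]   # local data sampling
        g = grad_fn(x, batch)
        x -= lr * (g - c_local + c_global)             # drift-corrected step

    # SCAFFOLD-style control-variate update from the local model delta
    c_local_new = c_local - c_global + (x_global - x) / (steps * lr)

    # Clip the model delta and add Gaussian noise before reporting to the PS
    delta = clip_l2(x - x_global, clip_norm)
    delta += rng.normal(0.0, sigma_t * clip_norm, size=delta.shape)
    return delta, c_local_new
```

In this sketch, privacy per round is governed by `sigma_t`; a time-varying allocation (e.g., larger noise in later rounds) can then be chosen so that the composed per-round guarantees stay within a fixed total DP budget, which is the role the two proposed noise allocation schemes play in the paper.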
Date of Conference: 09-11 March 2022
Date Added to IEEE Xplore: 14 April 2022
Conference Location: Princeton, NJ, USA

