
On the Convergence of Decentralized Stochastic Gradient Descent With Biased Gradients


Abstract:

Stochastic optimization algorithms are widely used to solve large-scale machine learning problems. However, their theoretical analysis typically requires access to unbiased estimates of the true gradients. To address this issue, we perform a comprehensive convergence rate analysis of stochastic gradient descent (SGD) with biased gradients for decentralized optimization. In non-convex settings, we show that for decentralized SGD with biased gradients, the expected gradient norm is asymptotically bounded at a rate of $\mathcal{O}(1/\sqrt{nT}+n/T)$, and that the bound depends linearly on the biased gradient gap. In particular, when the biased gradient gap is zero, we recover the convergence results of the unbiased stochastic gradient setting. Lastly, we provide empirical support for our theoretical findings through extensive numerical experiments.
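
The following is a minimal sketch (not the authors' code) of the setting the abstract describes: n nodes run decentralized SGD with gossip averaging over a ring topology, where each local stochastic gradient carries both zero-mean noise and a fixed bias term playing the role of the biased gradient gap. The quadratic objective, step size, topology, and bias level are illustrative assumptions.

```python
import numpy as np

def decentralized_biased_sgd(n_nodes=8, dim=10, T=2000, lr=0.05, bias=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # Local objectives f_i(x) = 0.5 * ||x - a_i||^2; the global optimum is mean(a_i).
    targets = rng.normal(size=(n_nodes, dim))
    x = rng.normal(size=(n_nodes, dim))          # one local iterate per node

    # Doubly stochastic mixing matrix for a ring graph: each node averages
    # with itself and its two neighbors.
    W = np.zeros((n_nodes, n_nodes))
    for i in range(n_nodes):
        W[i, i] = 0.5
        W[i, (i - 1) % n_nodes] = 0.25
        W[i, (i + 1) % n_nodes] = 0.25

    grad_norms = []
    for t in range(T):
        # Biased stochastic gradient: true local gradient + zero-mean noise
        # + a fixed bias vector (standing in for the biased gradient gap).
        true_grad = x - targets
        noise = rng.normal(scale=0.1, size=x.shape)
        g = true_grad + noise + bias * np.ones_like(x)

        # Local SGD step followed by one round of gossip averaging.
        x = W @ (x - lr * g)

        # Track the squared gradient norm at the network-average iterate.
        avg_x = x.mean(axis=0)
        grad_norms.append(np.linalg.norm(avg_x - targets.mean(axis=0)) ** 2)
    return grad_norms

if __name__ == "__main__":
    norms = decentralized_biased_sgd()
    print(f"final squared gradient norm at the average iterate: {norms[-1]:.4f}")
```

With bias=0 the residual error decays toward zero, while a nonzero bias leaves a floor whose size grows with the bias magnitude, consistent with a bound that scales linearly in the biased gradient gap.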
Published in: IEEE Transactions on Signal Processing ( Volume: 73)
Page(s): 549 - 558
Date of Publication: 20 January 2025

