
Data-Dependent Convergence for Consensus Stochastic Optimization


Abstract:

We study a distributed consensus-based stochastic gradient descent (SGD) algorithm and show that the rate of convergence involves the spectral properties of two matrices: the standard spectral gap of a weight matrix from the network topology, and a new term depending on the spectral norm of the sample covariance matrix of the data. This data-dependent convergence rate shows that distributed SGD algorithms perform better on datasets with small spectral norm. Our analysis method also allows us to find data-dependent convergence rates as we limit the amount of communication. Spreading a fixed amount of data across more nodes slows convergence; for asymptotically growing datasets, we show that adding more machines can help when minimizing twice-differentiable losses.
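The algorithm class analyzed above can be illustrated with a minimal sketch (not the paper's exact algorithm or notation): each node takes a local stochastic gradient step on its own data shard, then averages its iterate with its neighbors through a doubly stochastic weight matrix `W` determined by the network topology. The least-squares loss, node count, and step size below are illustrative assumptions.

```python
import numpy as np

def consensus_sgd(shards, W, dim, steps=500, lr=0.05, seed=0):
    """Sketch of consensus-based distributed SGD.

    shards: list of (A_i, b_i) local least-squares data per node
    W: doubly stochastic mixing matrix from the network topology
    """
    rng = np.random.default_rng(seed)
    m = len(shards)
    x = np.zeros((m, dim))               # one iterate per node
    for _ in range(steps):
        grads = np.zeros_like(x)
        for i, (A, b) in enumerate(shards):
            j = rng.integers(len(b))     # sample one local data point
            a = A[j]
            # gradient of the per-sample loss 0.5 * (a^T x - b_j)^2
            grads[i] = (a @ x[i] - b[j]) * a
        # consensus averaging with neighbors, then local SGD step
        x = W @ x - lr * grads
    return x

# Toy example: two nodes jointly fitting one linear model.
rng = np.random.default_rng(1)
x_true = np.array([1.0, -2.0])
A = rng.standard_normal((100, 2))
b = A @ x_true
shards = [(A[:50], b[:50]), (A[50:], b[50:])]
W = np.array([[0.5, 0.5], [0.5, 0.5]])   # fully connected 2-node network
x = consensus_sgd(shards, W, dim=2)
```

The spectral gap of `W` controls how fast the per-node iterates agree, while (per the abstract) the spectral norm of the data's sample covariance enters the rate through the local gradient steps.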
Published in: IEEE Transactions on Automatic Control (Volume: 62, Issue: 9, September 2017)
Page(s): 4483 - 4498
Date of Publication: 17 February 2017
