
On Decentralized Learning with Stochastic Subspace Descent


Abstract:

This work considers a high-dimensional decentralized optimization problem where computing the full gradient is prohibitively expensive. This is a common issue in Partial Differential Equation (PDE)-constrained optimization and some machine learning applications. Stochastic subspace descent (SSD) solves this problem in a single-agent setting by computing the projection of gradients onto random low-dimensional subspaces. We study this problem in the more challenging decentralized setting, where individual agents can only access their local objective losses. We propose VR-DSSD-GT, a novel variance reduction-based extension of SSD, to overcome this issue. Variance reduction (VR) helps achieve accelerated convergence, while gradient tracking (GT) facilitates exact convergence to the global solution regardless of the heterogeneity across local objectives. With smooth and strongly convex loss functions, our algorithm achieves linear convergence to the solution. Our results generalize existing results for gradient-based methods to the broader class of subspace-based methods. Experimental results corroborate and complement our theoretical findings.
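
For intuition on the subspace-descent building block, the following is a minimal single-agent sketch in NumPy. It implements only plain SSD on a hypothetical quadratic objective, not the VR-DSSD-GT algorithm proposed in the paper; the problem setup, dimensions, and step size are assumptions made purely for illustration.

```python
import numpy as np

# Minimal sketch of plain stochastic subspace descent (SSD) on a toy
# quadratic objective. This is NOT the paper's VR-DSSD-GT method (no
# variance reduction, no gradient tracking, no decentralization); it only
# illustrates the core SSD idea of stepping along the gradient's projection
# onto a random low-dimensional subspace. The objective, dimensions, and
# step size are illustrative assumptions.

d, ell = 200, 20                          # ambient dimension, subspace dimension (ell << d)
rng = np.random.default_rng(0)

A = rng.standard_normal((d, d)) / np.sqrt(d)
H = A.T @ A + np.eye(d)                   # positive-definite Hessian of the toy quadratic
b = rng.standard_normal(d)

def grad(x):
    # Full gradient of f(x) = 0.5 x^T H x - b^T x; cheap here, but
    # prohibitively expensive in the PDE-constrained settings that
    # motivate subspace methods.
    return H @ x - b

def ssd_step(x, step_size=0.02):
    # Draw a random d x ell basis Q with orthonormal columns.
    Q, _ = np.linalg.qr(rng.standard_normal((d, ell)))
    # (d / ell) * Q Q^T grad(x) is an unbiased estimate of the gradient
    # that only requires ell directional derivatives of f.
    g_hat = (d / ell) * (Q @ (Q.T @ grad(x)))
    return x - step_size * g_hat

x_star = np.linalg.solve(H, b)            # closed-form minimizer, used only for monitoring
x = np.zeros(d)
print("initial error:", np.linalg.norm(x - x_star))
for _ in range(500):
    x = ssd_step(x)
print("final error:  ", np.linalg.norm(x - x_star))
```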
Date of Conference: 06-11 April 2025
Date Added to IEEE Xplore: 07 March 2025
Conference Location: Hyderabad, India

