Abstract:
Nonconvex-concave (NC-C) finite-sum minimax problems have broad applications in decentralized optimization and various machine learning tasks. However, the nonsmooth nature of NC-C problems makes it challenging to design effective variance reduction techniques. Existing vanilla stochastic algorithms using uniform samples for gradient estimation often exhibit slow convergence rates and require bounded variance assumptions. In this paper, we develop a novel probabilistic variance reduction updating scheme and propose a single-loop algorithm called the probabilistic variance-reduced smoothed gradient descent-ascent (PVR-SGDA) algorithm. The proposed algorithm achieves an iteration complexity of $\mathcal{O}(\varepsilon^{-4})$, surpassing the best-known rates of stochastic algorithms for NC-C minimax problems and matching the performance of the best deterministic algorithms in this context. Finally, we demonstrate the effectiveness of the proposed algorithm through numerical simulations.
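To illustrate the kind of update the abstract describes, the sketch below combines a probabilistic (PAGE-style) variance-reduced gradient estimator with a smoothed gradient descent-ascent loop on a toy NC-C finite-sum problem. This is a minimal sketch under assumed design choices: the toy objective, the PAGE-style switch between a full-gradient refresh and a recursive minibatch correction, the smoothing sequence, and all step sizes and variable names are illustrative assumptions drawn from the standard variance-reduction and smoothed-GDA literature, not the paper's exact PVR-SGDA scheme.

```python
"""
Minimal sketch (assumed, not the paper's exact method): probabilistic
variance-reduced smoothed GDA on the toy NC-C finite-sum minimax problem

    min_x max_{y in [-1, 1]}  (1/n) * sum_i  y * (sin(a_i^T x) - b_i).
"""
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10                         # number of components, dimension
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_x(idx, x, y):
    # Gradient in x of mean_i y * (sin(a_i^T x) - b_i) over the index set idx.
    s = A[idx] @ x
    return (y * np.cos(s)) @ A[idx] / len(idx)

def grad_y(idx, x):
    # Gradient in y of mean_i y * (sin(a_i^T x) - b_i) over the index set idx.
    return np.mean(np.sin(A[idx] @ x) - b[idx])

T, batch, p = 2000, 8, 0.05            # iterations, minibatch size, switch prob.
eta_x, eta_y, beta, r = 0.05, 0.1, 10.0, 0.5  # step sizes / smoothing params (assumed)

x = rng.standard_normal(d)
z = x.copy()                           # auxiliary smoothing sequence
y = 0.0
full = np.arange(n)
v = grad_x(full, x, y)                 # initial full x-gradient estimate

for t in range(T):
    x_old, y_old = x.copy(), y
    # Smoothed primal step: VR gradient estimate plus proximal pull toward z.
    x = x - eta_x * (v + beta * (x - z))
    # Dual ascent step on a minibatch, projected onto [-1, 1].
    idx = rng.choice(n, batch, replace=False)
    y = np.clip(y + eta_y * grad_y(idx, x), -1.0, 1.0)
    # Smoothing (auxiliary) sequence update.
    z = z + r * (x - z)
    # Probabilistic variance-reduced estimator update for the x-gradient.
    if rng.random() < p:
        v = grad_x(full, x, y)                                  # occasional full refresh
    else:
        idx = rng.choice(n, batch, replace=False)
        v = v + grad_x(idx, x, y) - grad_x(idx, x_old, y_old)   # recursive correction

print("final ||x-gradient estimate|| =", np.linalg.norm(v))
```

The key point the sketch conveys is the single-loop structure: the gradient estimator is refreshed with a full pass only with small probability p and is otherwise corrected cheaply on a minibatch, which is what avoids the bounded-variance assumption needed by vanilla uniform-sampling stochastic GDA.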
Published in: ICASSP 2025 - 2025 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Date of Conference: 06-11 April 2025
Date Added to IEEE Xplore: 07 March 2025