Abstract:
The hybrid steepest descent method (HSDM) [Yamada, '01] was introduced as a low-computational-complexity tool for solving convex variational-inequality problems over the fixed-point set of non-expansive mappings in Hilbert spaces. Motivated by results on decentralized optimization, this study introduces an HSDM variant that extends, for the first time, the applicability of HSDM to affinely constrained composite convex minimization tasks over Euclidean spaces; this is the same class of problems solved by the popular alternating direction method of multipliers and primal-dual methods. The proposed scheme shows desirable attributes for large-scale optimization tasks that have not been met, partly or altogether, in any other member of the HSDM family of algorithms: tunable computational complexity; a step-size parameter that stays constant over recursions, thus promoting acceleration of convergence; no boundedness constraints on iterates and/or gradients; and the ability to deal with convex losses that comprise a smooth and a non-smooth part, where the smooth part is only required to have a Lipschitz-continuous derivative. Convergence guarantees and rates are established. Numerical tests on synthetic data and on colored-image inpainting underline the rich potential of the proposed scheme for large-scale optimization tasks.
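For orientation, the following is a minimal sketch of the classical HSDM recursion of [Yamada, '01], x_{k+1} = T(x_k) - lambda_k * mu * F(T(x_k)), with T a non-expansive mapping whose fixed-point set encodes the constraint and F the gradient of the smooth cost. It illustrates the baseline method only, not the variant proposed in this paper (whose constant-step-size, composite-loss construction is detailed there); the choice of T, the cost, and all parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 5

# Affine constraint set {x : A x = b}; its metric projection is a
# (firmly) non-expansive mapping whose fixed-point set is that set.
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
A_pinv = np.linalg.pinv(A)

def T(x):
    """Projection onto {x : A x = b} (non-expansive)."""
    return x - A_pinv @ (A @ x - b)

# Illustrative smooth, strongly convex cost 0.5 * ||x - c||^2,
# whose gradient F(x) = x - c is strongly monotone and Lipschitz.
c = rng.standard_normal(n)
def F(x):
    return x - c

# Classical HSDM: diminishing steps lambda_k -> 0 with sum = infinity.
# (The paper's variant instead keeps its step-size constant.)
x = np.zeros(n)
mu = 1.0
for k in range(1, 2001):
    lam = 1.0 / k
    y = T(x)
    x = y - lam * mu * F(y)

# x approximates the minimizer of the cost over {x : A x = b}.
print("constraint residual:", np.linalg.norm(A @ x - b))
```

In this toy instance the limit point is simply the projection of c onto the affine set, so the recursion can be checked against a closed-form answer.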
Published in: 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Date of Conference: 05-09 March 2017
Date Added to IEEE Xplore: 19 June 2017
Electronic ISSN: 2379-190X