Abstract:
Various problems arising in control and data analysis can be formulated as large-scale convex optimization problems with a composite objective structure. Within the black-box optimization framework, such problems are typically solved using accelerated first-order methods. Celebrated examples of such methods are the Fast Gradient Method and the Accelerated Multistep Gradient Method, both designed within the estimating sequences framework. In this work, we present a new class of estimating sequences, constructed by combining a tighter lower bound on the objective function with the gradient mapping technique. Based on the newly introduced estimating sequences, we construct a new method equipped with an efficient line-search strategy that provides robustness to imperfect knowledge of the Lipschitz constant. The proposed method enjoys an accelerated convergence rate, and our theoretical results are corroborated by numerical experiments conducted on real-world datasets. The experiments also demonstrate that the initialization of the proposed method is robust to imperfect knowledge of the strong convexity parameter of the objective function.
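The paper's new estimating-sequences method is not reproduced here, but the setting it targets can be illustrated with a standard baseline: a FISTA-style accelerated proximal gradient method for composite objectives f(x) + g(x), with a backtracking line search that adaptively estimates the Lipschitz constant rather than requiring it in advance. This is a minimal sketch under those assumptions; the function names and the LASSO test instance below are illustrative, not from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (the l1 regularizer g).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_backtracking(grad_f, f, prox_g, x0, L0=1.0, eta=2.0, max_iter=200):
    """Accelerated proximal gradient (FISTA) with backtracking line search.

    Minimizes f(x) + g(x), where f is smooth and convex and g admits a
    cheap proximal operator. The backtracking loop grows the estimate L
    until a sufficient-decrease condition holds, so exact knowledge of
    the Lipschitz constant of grad_f is not needed.
    """
    x, y, t, L = x0.copy(), x0.copy(), 1.0, L0
    for _ in range(max_iter):
        g_y, f_y = grad_f(y), f(y)
        while True:
            # Candidate proximal gradient step from the extrapolated point y.
            x_next = prox_g(y - g_y / L, 1.0 / L)
            diff = x_next - y
            # Quadratic upper-bound test; increase L until it holds.
            if f(x_next) <= f_y + g_y @ diff + 0.5 * L * (diff @ diff):
                break
            L *= eta
        # Nesterov momentum update.
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Usage on a small LASSO instance: min 0.5*||Ax - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
lam = 0.1
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, s: soft_threshold(v, s * lam)
x_star = fista_backtracking(grad_f, f, prox_g, np.zeros(20))
```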
Published in: 2022 IEEE 61st Conference on Decision and Control (CDC)
Date of Conference: 06-09 December 2022
Date Added to IEEE Xplore: 10 January 2023