
An Iterative Regularized Incremental Projected Subgradient Method for a Class of Bilevel Optimization Problems


Abstract:

We study a class of bilevel convex optimization problems where the goal is to find the minimizer of an objective function in the upper level, among the set of all optimal solutions of an optimization problem in the lower level. A wide range of problems in convex optimization can be formulated using this class. An important example is the case where an optimization problem is ill-posed. In this paper, our interest lies in addressing bilevel problems where the lower level objective is given as a finite sum of separate nondifferentiable convex component functions. This is the case in a variety of applications in distributed optimization, such as large-scale data processing in machine learning and neural networks. To the best of our knowledge, this class of bilevel problems, with a finite sum in the lower level, has not been addressed before. Motivated by this gap, we develop an iterative regularized incremental subgradient method, where the agents update their iterates in a cyclic manner using a regularized subgradient. Under a suitable choice of the regularization parameter sequence, we establish the convergence of the proposed algorithm and derive a rate of O(1/k^{0.5-ε}) in terms of the lower level objective function for an arbitrarily small ε>0. We present the performance of the algorithm on a binary text classification problem.
Date of Conference: 10-12 July 2019
Date Added to IEEE Xplore: 29 August 2019
Conference Location: Philadelphia, PA, USA

I. Introduction

In this paper, we consider a class of bilevel optimization problems of the form \begin{align*} &\mathrm{minimize}\qquad h(x) \qquad\qquad \qquad \qquad \qquad \qquad (P_{f}^{h})\\ &\text{subject to}\qquad x\in X^{\ast}\triangleq\arg\min_{y\in X}f(y), \end{align*} where f and h denote the lower and upper level objective functions, respectively, and X\subseteq\mathbb{R}^{n} is a constraint set. This is called the selection problem ([9], [23]), as we select, among the optimal solutions of the lower level problem, one that minimizes the objective function h. In particular, we consider the case where the lower level objective function is given as the finite sum f(y)=\sum_{i=1}^{m}f_{i}(y), where f_{i} is the i-th component function for i=1,\ldots,m.
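To make the structure of the method concrete, the following is a minimal sketch of a regularized incremental projected subgradient loop for this setting: in each cycle, the components f_i are visited in order, the iterate moves along a subgradient of f_i plus a regularization term driven by h, and the result is projected back onto X. The function names, the specific step-size schedule, and the regularization schedule below are illustrative assumptions, not the schedules prescribed in the paper.

```python
import numpy as np

def regularized_incremental_projected_subgradient(subgrads_f, grad_h, project, x0,
                                                  num_cycles=1000,
                                                  gamma0=1.0, lambda0=1.0):
    """Hypothetical sketch of the update scheme described above.

    subgrads_f : list of callables; subgrads_f[i](x) returns a subgradient of f_i at x
    grad_h     : callable returning a (sub)gradient of the upper-level objective h at x
    project    : callable implementing the Euclidean projection onto the constraint set X
    x0         : initial point in X

    The diminishing schedules gamma_k and lambda_k below are placeholder choices;
    the paper selects them so that the stated rate in the lower-level objective holds.
    """
    x = np.asarray(x0, dtype=float)
    m = len(subgrads_f)
    for k in range(1, num_cycles + 1):
        gamma_k = gamma0 / k ** 0.75      # step size (assumed schedule)
        lambda_k = lambda0 / k ** 0.25    # regularization parameter (assumed schedule)
        for i in range(m):                # one cyclic pass over the m component functions
            g = subgrads_f[i](x) + lambda_k * grad_h(x)   # regularized subgradient
            x = project(x - gamma_k * g)                   # projected subgradient step
    return x
```

For a simple box constraint X = [-1, 1]^n, for instance, the projection could be supplied as project = lambda x: np.clip(x, -1.0, 1.0).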

