The paper describes a new conjugate gradient algorithm for large-scale nonconvex problems with box constraints. To speed up convergence, the algorithm employs a scaling matrix that transforms the space of original variables into a space in which the Hessian matrices of the functionals describing the problems have more clustered eigenvalues. This is done efficiently by applying limited-memory BFGS updating matrices. Once the scaling matrix is calculated, the next few iterations of the conjugate gradient algorithm are performed in the transformed space. The box constraints are handled efficiently by projection. We believe that the preconditioned conjugate gradient algorithm is competitive with the L-BFGS-B algorithm, and we give numerical results that support this claim.
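The two ingredients named in the abstract, a scaled descent direction and projection onto the box, can be illustrated with a minimal sketch. This is not the paper's algorithm: the function names, the exact inverse Hessian standing in for the limited-memory BFGS approximation, and the fixed step length are all illustrative assumptions.

```python
import numpy as np

def project_box(x, lower, upper):
    """Project a point onto the box {x : lower <= x <= upper}."""
    return np.minimum(np.maximum(x, lower), upper)

def projected_scaled_step(x, grad, apply_h_inv, alpha, lower, upper):
    """One illustrative iteration: scale the gradient by an
    inverse-Hessian approximation (the role played by L-BFGS
    matrices in the paper), step, and project back onto the box."""
    d = -apply_h_inv(grad)                      # scaled descent direction
    return project_box(x + alpha * d, lower, upper)

# Toy problem: minimize f(x) = 0.5 x^T A x - b^T x over [0, 1]^2.
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
lower, upper = np.zeros(2), np.ones(2)

h_inv = np.linalg.inv(A)                        # stand-in for an L-BFGS approximation
x = np.array([0.5, 0.5])
for _ in range(50):
    g = A @ x - b                               # gradient of the quadratic
    x = projected_scaled_step(x, g, lambda v: h_inv @ v, 1.0, lower, upper)
```

For this quadratic the iterate settles at the box-constrained minimizer `[1.0, 0.1]`; scaling by the (approximate) inverse Hessian is what clusters the eigenvalues and keeps the iteration count low, which is the effect the paper's preconditioning aims for.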
Date of Conference: 12-15 Dec. 2005