The details of a solver for minimizing a strictly convex quadratic objective function subject to general linear constraints are presented. The method uses a gradient projection algorithm enhanced with subspace acceleration to solve the bound-constrained dual optimization problem. Such gradient projection methods are well known, but are typically employed to solve the primal problem when only simple bound constraints are present. The main contributions of this work are threefold. First, we address the challenges associated with solving the dual problem, which is convex, but not necessarily strictly convex, even when the primal problem is strictly convex. In particular, for the dual problem, one must efficiently compute directions of infinite descent when they exist, which is precisely when the primal formulation is infeasible. Second, we show how the linear algebra may be arranged to take computational advantage of \emph{sparsity} that is often present in the second-derivative matrix, mostly by showing how sparse updates may be performed for algorithmic quantities. We consider both the case in which the second-derivative matrix is explicitly available and sparse, and the case in which it is available implicitly via a limited-memory BFGS representation. Third, we present the details of our Fortran 2003 software package \dqp, which is part of the \galahad{} suite of optimization routines. Numerical tests are performed on quadratic programming problems from the combined \cutest\ and Maros and Mészáros test sets.
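To make the core idea concrete, the following is a minimal dense NumPy sketch of a plain dual gradient-projection iteration for a strictly convex QP with inequality constraints. It omits everything that makes the paper's method practical (subspace acceleration, sparse updates, infinite-descent detection, the L-BFGS Hessian representation), and all names and the fixed step size are illustrative choices, not part of the \dqp{} package.

```python
import numpy as np

def dual_gradient_projection(H, c, A, b, tol=1e-8, max_iter=5000):
    """Projected-gradient sketch on the dual of the primal QP
         min  0.5 x'Hx + c'x   s.t.  Ax >= b,
    with H symmetric positive definite.  Eliminating x via
    Hx + c = A'y gives a bound-constrained dual in y >= 0."""
    m = A.shape[0]
    y = np.zeros(m)                      # dual iterate, kept in y >= 0
    Hinv = np.linalg.inv(H)              # acceptable for a small dense sketch only
    # fixed step 1/L, where L bounds the dual gradient's Lipschitz constant
    alpha = 1.0 / np.linalg.norm(A @ Hinv @ A.T, 2)
    for _ in range(max_iter):
        x = Hinv @ (A.T @ y - c)         # primal point recovered from y
        grad = A @ x - b                 # gradient of the negated dual objective
        y_new = np.maximum(0.0, y - alpha * grad)  # project onto the bounds y >= 0
        if np.linalg.norm(y_new - y) <= tol:
            y = y_new
            break
        y = y_new
    x = Hinv @ (A.T @ y - c)
    return x, y
```

For example, minimizing $\tfrac12\|x\|^2 - x_1 - x_2$ subject to $x_1 + x_2 \ge 3$ yields the active-constraint solution $x = (1.5, 1.5)$ with multiplier $y = 0.5$, which this iteration reaches in a handful of steps.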

## Citation

Johns Hopkins University, Applied Mathematics and Statistics, Baltimore, MD, Technical Report OPT-2016/2b

## Article

A Dual Gradient-Projection Method for Large-Scale Strictly Convex Quadratic Problems