A Decision Space Algorithm for Multiobjective Convex Quadratic Integer Optimization

We present a branch-and-bound algorithm for minimizing multiple convex quadratic objective functions over integer variables. Our method looks for efficient points by fixing subsets of variables to integer values and by using lower bounds in the form of hyperplanes in the image space derived from the continuous relaxations of the restricted objective functions. We show … Read more
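
Not the paper's branch-and-bound: as a minimal illustration of what "efficient points" means in the image space, the Python sketch below enumerates a small integer box and keeps the nondominated images of two convex quadratic objectives; all problem data are invented.

```python
# Brute-force Pareto filter over a small integer box (illustration only).
import itertools
import numpy as np

Q1 = np.array([[2.0, 0.0], [0.0, 1.0]])   # assumed positive definite
Q2 = np.array([[1.0, 0.5], [0.5, 2.0]])
c1 = np.array([-4.0, 0.0])
c2 = np.array([0.0, -6.0])

def f(x):
    """Return the two quadratic objective values at x."""
    x = np.asarray(x, dtype=float)
    return np.array([x @ Q1 @ x + c1 @ x, x @ Q2 @ x + c2 @ x])

# Enumerate integer points in a small box and keep the nondominated images.
points = list(itertools.product(range(-3, 4), repeat=2))
images = [(x, f(x)) for x in points]
for x, fx in images:
    dominated = any(np.all(fy <= fx) and np.any(fy < fx) for _, fy in images)
    if not dominated:
        print(x, fx)
```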

Two novel gradient methods with optimal step sizes

In this work we introduce two new Barzilai-Borwein-like step sizes for the classical gradient method for strictly convex quadratic optimization problems. The proposed step sizes employ second-order information in order to obtain faster gradient-type methods. Both step sizes are derived from two unconstrained optimization models that involve approximate information of the Hessian of … Read more
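
For context, here is the classical Barzilai-Borwein gradient method for $f(x) = \frac{1}{2}x^TAx - b^Tx$ that such step sizes build on; the two step sizes proposed in the paper are not reproduced here, and the first steplength below is an arbitrary safe choice.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=500):
    x = x0.astype(float)
    g = A @ x - b
    alpha = 1.0 / np.linalg.norm(A, 2)        # safe first steplength
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)             # BB1 step size
        x, g = x_new, g_new
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
print(bb_gradient(A, b, np.zeros(2)))
```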

A Hybrid Gradient Method for Strictly Convex Quadratic Programming

In this paper, a reliable hybrid algorithm for solving convex quadratic minimization problems is presented. At each iteration, two points are computed: first, an auxiliary point $\dot{x}_k$ is generated by performing a gradient step with an optimal steplength; then, the next iterate $x_{k+1}$ is obtained through a weighted sum of $\dot{x}_k$ with the penultimate … Read more
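
Since the abstract is truncated, the exact weighting rule is not known; the sketch below takes $\dot{x}_k$ as a Cauchy (exact steplength) gradient step and, as one natural choice for quadratics $f(x) = \frac{1}{2}x^TAx - b^Tx$, picks the weight by exact line search on the line through the previous iterate.

```python
import numpy as np

def hybrid_gradient(A, b, x0, tol=1e-8, max_iter=500):
    x_prev = x0.astype(float)
    x = x_prev.copy()
    for _ in range(max_iter):
        g = A @ x - b
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))       # exact (Cauchy) steplength
        x_dot = x - alpha * g                 # auxiliary point
        d = x_dot - x_prev
        t = -(d @ (A @ x_prev - b)) / (d @ (A @ d))   # optimal weight
        x_prev, x = x, x_prev + t * d
    return x

A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
print(hybrid_gradient(A, b, np.zeros(3)))
```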

A Delayed Weighted Gradient Method for Strictly Convex Quadratic Minimization

This paper develops an accelerated version of the steepest descent method via a two-step iteration. The new algorithm uses delayed information to define its iterations. Specifically, in the first step, a prediction of the new test point is calculated by using the gradient method with the exact minimal gradient steplength; then, a correction … Read more
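
One way to realize such a prediction-correction scheme for $f(x) = \frac{1}{2}x^TAx - b^Tx$, sketched under assumptions that may differ from the paper: the prediction uses the steplength minimizing the norm of the next gradient, and the correction weight minimizes the gradient norm over combinations with the delayed iterate; both have closed forms because gradients of a quadratic are affine.

```python
import numpy as np

def dwgm(A, b, x0, tol=1e-8, max_iter=500):
    x = x0.astype(float)
    g = A @ x - b
    x_old, g_old = x.copy(), g.copy()         # delayed (previous) information
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ag = A @ g
        alpha = (g @ Ag) / (Ag @ Ag)          # minimal-gradient steplength
        y = x - alpha * g                     # prediction
        g_y = g - alpha * Ag                  # gradient at y (affine map)
        d = g_old - g_y
        beta = (g_old @ d) / (d @ d)          # correction weight
        x_new = (1 - beta) * x_old + beta * y
        g_new = (1 - beta) * g_old + beta * g_y
        x_old, g_old = x, g
        x, g = x_new, g_new
    return x

A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
print(dwgm(A, b, np.zeros(3)))
```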

Towards an efficient Augmented Lagrangian method for convex quadratic programming

Interior point methods have attracted most of the attention in recent decades for solving large scale convex quadratic programming problems. In this paper we take a different route, as we present an augmented Lagrangian method for convex quadratic programming based on recent developments for nonlinear programming. In our approach, box constraints are penalized while … Read more
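
As a hedged point of reference rather than the method of the paper, a textbook augmented Lagrangian loop for an equality-constrained convex QP, $\min \frac{1}{2}x^TQx + c^Tx$ s.t. $Ax = b$, looks as follows; each subproblem is an unconstrained quadratic solved exactly.

```python
import numpy as np

def aug_lag_qp(Q, c, A, b, rho=10.0, tol=1e-8, max_iter=100):
    n, m = Q.shape[0], A.shape[0]
    x, lam = np.zeros(n), np.zeros(m)
    H = Q + rho * A.T @ A                     # Hessian of the subproblem
    for _ in range(max_iter):
        rhs = -(c + A.T @ lam - rho * A.T @ b)
        x = np.linalg.solve(H, rhs)           # exact subproblem solve
        r = A @ x - b                         # primal residual
        if np.linalg.norm(r) < tol:
            break
        lam = lam + rho * r                   # multiplier update
    return x, lam

Q = np.array([[4.0, 1.0], [1.0, 3.0]])
c = np.array([-1.0, -2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
print(aug_lag_qp(Q, c, A, b))
```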

A simplicial decomposition framework for large scale convex quadratic programming

In this paper, we analyze in depth a simplicial decomposition-like algorithmic framework for large scale convex quadratic programming. In particular, we first propose two tailored strategies for handling the master problem. Then, we describe a few techniques for speeding up the solution of the pricing problem. We report extensive numerical experiments on both real … Read more
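
The sketch below only illustrates the master/pricing interplay, not the paper's tailored strategies: pricing is a linear program over $\{x : A_{ub}x \le b_{ub}\}$ solved with SciPy's linprog, and the master problem, a QP over the convex hull of the vertices generated so far, is solved with SLSQP over the unit simplex.

```python
import numpy as np
from scipy.optimize import linprog, minimize

def simplicial_decomposition(Q, c, A_ub, b_ub, max_iter=50, tol=1e-8):
    f = lambda x: 0.5 * x @ Q @ x + c @ x
    grad = lambda x: Q @ x + c
    free = [(None, None)] * Q.shape[0]        # rely on A_ub x <= b_ub only
    x = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=free).x   # initial vertex
    V = [x]
    for _ in range(max_iter):
        # Pricing: minimize the linearization of f to propose a new vertex.
        v = linprog(grad(x), A_ub=A_ub, b_ub=b_ub, bounds=free).x
        if grad(x) @ (v - x) > -tol:          # no improving vertex remains
            break
        V.append(v)
        # Master: minimize f over the convex hull of the vertices found.
        W = np.array(V).T                     # columns are vertices
        k = W.shape[1]
        res = minimize(lambda w: f(W @ w), np.ones(k) / k, method="SLSQP",
                       bounds=[(0, 1)] * k,
                       constraints={"type": "eq",
                                    "fun": lambda w: w.sum() - 1.0})
        x = W @ res.x
    return x

Q = np.eye(2)
c = np.array([-3.0, -1.0])
A_ub = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b_ub = np.array([2.0, 0.0, 0.0])
print(simplicial_decomposition(Q, c, A_ub, b_ub))
```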

Stability and accuracy of Inexact Interior Point methods for convex quadratic programming

We consider primal-dual interior point (IP) methods where the linear system arising at each iteration is formulated in the reduced (augmented) form and solved approximately. Focusing on the iterates close to a solution, we analyze the accuracy of the so-called inexact step, i.e., the step that solves the unreduced system, when combining the effects of both different … Read more
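
To illustrate what an inexact step means here, with made-up data rather than the paper's setting: assemble the reduced (augmented) KKT system at a strictly feasible point of $\min \frac{1}{2}x^TQx + c^Tx$ s.t. $Ax = b$, $x \ge 0$, truncate a MINRES solve, and compare against a direct solve.

```python
import numpy as np
from scipy.sparse.linalg import minres

rng = np.random.default_rng(0)
n, m, mu = 6, 2, 1e-2
Q = np.diag(rng.uniform(1.0, 3.0, n))          # SPD quadratic term (assumed)
c = rng.standard_normal(n)
A = rng.standard_normal((m, n))
x = np.ones(n)                                 # strictly positive iterate
z = mu / x                                     # dual slacks with x_i z_i = mu
y = np.zeros(m)
b = A @ x                                      # chosen so that x is feasible

r_d = Q @ x + c - A.T @ y - z                  # dual residual
r_p = A @ x - b                                # primal residual (zero here)
r_c = mu - x * z                               # centrality residual (zero here)

# Reduced (augmented) system in (dx, dy); symmetric indefinite, so MINRES applies.
K = np.block([[-(Q + np.diag(z / x)), A.T],
              [A, np.zeros((m, m))]])
rhs = np.concatenate([r_d - r_c / x, -r_p])

inexact, info = minres(K, rhs, maxiter=4)      # truncated, hence inexact, solve
exact = np.linalg.solve(K, rhs)
print("step error from truncation:", np.linalg.norm(inexact - exact))
```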

A recursive semi-smooth Newton method for linear complementarity problems

A primal feasible active set method is presented for finding the unique solution of a Linear Complementarity Problem (LCP) with a P-matrix, which extends the globally convergent active set method for strictly convex quadratic problems with simple bounds proposed by [P. Hungerlaender and F. Rendl. A feasible active set method for strictly convex problems with … Read more
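
For background, a plain (non-recursive) semi-smooth Newton iteration for the reformulation $F(z) = \min(z, Mz + q) = 0$ looks as follows; the recursion and the feasible active set management of the paper are not reproduced.

```python
import numpy as np

def ssn_lcp(M, q, z0=None, tol=1e-10, max_iter=50):
    n = len(q)
    z = np.zeros(n) if z0 is None else z0.astype(float)
    for _ in range(max_iter):
        w = M @ z + q
        F = np.minimum(z, w)
        if np.linalg.norm(F, np.inf) < tol:
            break
        # Generalized Jacobian: identity rows where z attains the min,
        # rows of M where w does.
        J = np.where((z <= w)[:, None], np.eye(n), M)
        z = z - np.linalg.solve(J, F)
    return z

M = np.array([[2.0, 1.0], [0.0, 3.0]])   # a P-matrix (positive principal minors)
q = np.array([-4.0, -1.0])
z = ssn_lcp(M, q)
print(z, M @ z + q)
```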

Regularized monotonic regression

Monotonic (isotonic) regression (MR) is a powerful tool used for solving a wide range of important applied problems. One of its features, which poses a limitation on its use in some areas, is that it produces a piecewise-constant fitted response. For smoothing the fitted response, we introduce a regularization term in the MR formulated … Read more
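
A hedged sketch of the idea, not the paper's formulation or solver: fit a monotone sequence $x$ to data $y$ while penalizing successive differences, and solve the resulting small QP with SLSQP; the penalty weight lam is an illustrative assumption.

```python
import numpy as np
from scipy.optimize import minimize

def regularized_mr(y, lam=1.0):
    y = np.asarray(y, dtype=float)
    # Least-squares fit plus a smoothing term on successive differences.
    obj = lambda x: np.sum((x - y) ** 2) + lam * np.sum(np.diff(x) ** 2)
    cons = {"type": "ineq", "fun": np.diff}   # x[i+1] - x[i] >= 0
    res = minimize(obj, np.sort(y), method="SLSQP", constraints=cons)
    return res.x

y = [1.0, 3.0, 2.0, 2.5, 5.0, 4.0]
print(np.round(regularized_mr(y, lam=0.5), 3))
```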

An optimal first order method based on optimal quadratic averaging

In a recent paper, Bubeck, Lee, and Singh introduced a new first order method for minimizing smooth strongly convex functions. Their geometric descent algorithm, largely inspired by the ellipsoid method, enjoys the optimal linear rate of convergence. Motivated by their work, we propose a close variant that iteratively maintains a quadratic global under-estimator of the … Read more
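
A simplified loop in the same spirit, not the authors' full method: for a $\mu$-strongly convex, $L$-smooth $f$, every visited point yields a global quadratic under-estimator of curvature $\mu$, two such estimators average in closed form, and the weight below maximizes the minimum value of the average, giving a certified gap.

```python
import numpy as np

def quad_avg_descent(f, grad, x0, L, mu, max_iter=200, tol=1e-10):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    # Maintained under-estimator Q(u) = v + (mu/2)||u - c||^2 <= f(u).
    c = x - g / mu
    v = f(x) - (g @ g) / (2 * mu)
    gap = f(x) - v                             # certified optimality gap
    for _ in range(max_iter):
        x = x - grad(x) / L                    # plain gradient step
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        c_new = x - g / mu                     # new under-estimator from x
        v_new = f(x) - (g @ g) / (2 * mu)
        D = np.sum((c - c_new) ** 2)
        if D > 0:
            # Weight maximizing the minimum value of the averaged quadratic.
            lam = np.clip(0.5 + (v_new - v) / (mu * D), 0.0, 1.0)
            v = lam * v_new + (1 - lam) * v + 0.5 * mu * lam * (1 - lam) * D
            c = lam * c_new + (1 - lam) * c
        gap = f(x) - v
    return x, gap

A = np.diag([1.0, 4.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x, gap = quad_avg_descent(f, grad, np.array([3.0, 1.0]), L=4.0, mu=1.0)
print(x, gap)
```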