A structured modified Newton approach for solving systems of nonlinear equations arising in interior-point methods for quadratic programming

The focus in this work is interior-point methods for quadratic optimization problems with linear inequality constraints, where the systems of nonlinear equations that arise are solved with Newton-like methods. In particular, the concern is the system of linear equations to be solved at each iteration. Newton systems give high-quality solutions, but there is an … Read more
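
For orientation only, the following is a standard way to set up the linear systems referred to above; it is a sketch of the usual primal-dual setting, not necessarily the exact formulation of the paper. For the quadratic program
\[
\min_{x}\; \tfrac12 x^T H x + c^T x \quad \text{subject to} \quad Ax \ge b,
\]
with slacks $s = Ax - b$ and multipliers $\lambda$, a Newton-like method applied to the perturbed KKT conditions requires, at each iteration, the solution of a structured linear system of the form
\[
\begin{pmatrix} H & -A^T & 0 \\ A & 0 & -I \\ 0 & S & \Lambda \end{pmatrix}
\begin{pmatrix} \Delta x \\ \Delta \lambda \\ \Delta s \end{pmatrix}
= -\begin{pmatrix} Hx + c - A^T\lambda \\ Ax - s - b \\ \Lambda S e - \mu e \end{pmatrix},
\]
where $S = \operatorname{diag}(s)$, $\Lambda = \operatorname{diag}(\lambda)$, $e$ is the vector of ones and $\mu > 0$ is the barrier parameter.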

Approximate solution of system of equations arising in interior-point methods for bound-constrained optimization

The focus in this paper is interior-point methods for bound-constrained nonlinear optimization, where the systems of nonlinear equations that arise are solved with Newton’s method. There is a trade-off between solving Newton systems directly, which gives high-quality solutions, and solving many approximate Newton systems, which is computationally less expensive but gives lower-quality solutions. … Read more
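
As a hedged illustration of the systems in question (a sketch of the standard primal-dual treatment of lower bounds, not necessarily the paper's exact formulation): for $\min f(x)$ subject to $x \ge 0$, with multipliers $z$, the perturbed first-order conditions $\nabla f(x) - z = 0$ and $XZe = \mu e$ lead to Newton systems
\[
\begin{pmatrix} \nabla^2 f(x) & -I \\ Z & X \end{pmatrix}
\begin{pmatrix} \Delta x \\ \Delta z \end{pmatrix}
= -\begin{pmatrix} \nabla f(x) - z \\ XZe - \mu e \end{pmatrix},
\]
where $X = \operatorname{diag}(x)$, $Z = \operatorname{diag}(z)$ and $\mu > 0$. The trade-off described above concerns solving such systems to high accuracy versus only approximately.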

On the existence of a short pivoting sequence for a linear program

Pivoting methods are of vital importance for linear programming, the simplex method being by far the most well known. In this paper, a primal-dual pair of linear programs in canonical form is considered. We show that there exists a sequence of pivots, whose length is bounded by the minimum dimension of the constraint matrix, such that … Read more
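
For concreteness, under the standard reading of “canonical form” (an assumption made here only for illustration), the primal-dual pair may be written
\[
\text{(P)}\;\; \min_{x}\; c^T x \;\; \text{s.t.}\;\; Ax \ge b,\; x \ge 0,
\qquad
\text{(D)}\;\; \max_{y}\; b^T y \;\; \text{s.t.}\;\; A^T y \le c,\; y \ge 0,
\]
with $A$ of size $m \times n$, so that a bound by the minimum dimension of the constraint matrix means a pivoting sequence of length at most $\min\{m, n\}$.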

On tradeoffs between treatment time and plan quality of volumetric-modulated arc therapy with sliding-window delivery

The purpose of this study is to give an exact formulation of the optimization of volumetric-modulated arc therapy (VMAT) with sliding-window delivery, and to investigate the plan-quality effects of decreasing the number of sliding-window sweeps made over the 360-degree arc to obtain a faster VMAT treatment. In light of the exact formulation, we interpret an algorithm … Read more

On limited-memory quasi-Newton methods for minimizing a quadratic function

The main focus in this paper is exact linesearch methods for minimizing a quadratic function whose Hessian is positive definite. We give two classes of limited-memory quasi-Newton Hessian approximations that generate search directions parallel to those of the method of preconditioned conjugate gradients, and hence give finite termination on quadratic optimization problems. The Hessian approximations … Read more
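
For reference, the setting is the strictly convex quadratic $q(x) = \tfrac12 x^T H x + c^T x$ with $H$ positive definite, and an exact linesearch along a direction $p_k$ takes
\[
x_{k+1} = x_k + \alpha_k p_k, \qquad
\alpha_k = -\frac{g_k^T p_k}{p_k^T H p_k}, \qquad g_k = Hx_k + c ;
\]
finite termination then follows whenever the directions $p_k$ are parallel to those of the (preconditioned) conjugate gradient method, which is the property the limited-memory Hessian approximations above are designed to preserve.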

Active-Set Methods for Convex Quadratic Programming

Computational methods are proposed for solving a convex quadratic program (QP). Active-set methods are defined for a particular primal and dual formulation of a QP with general equality constraints and simple lower bounds on the variables. In the first part of the paper, two methods are proposed, one primal and one dual. These methods generate … Read more
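
A minimal sketch of the problem format referred to above, with the simple lower bounds written as $x \ge \ell$ (the paper's precise notation may differ):
\[
\min_{x}\;\; \tfrac12 x^T H x + c^T x
\quad \text{subject to} \quad Ax = b, \;\; x \ge \ell,
\]
with $H$ symmetric positive semidefinite. An active-set method maintains a working set of bounds held fixed at their values $\ell_i$ and, at each iteration, solves an equality-constrained subproblem in the remaining free variables.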

On the equivalence of the method of conjugate gradients and quasi-Newton methods on quadratic problems

In this paper we state necessary and sufficient conditions for equivalence of the method of conjugate gradients and quasi-Newton methods on a quadratic problem. We show that the set of quasi-Newton schemes that generate parallel search directions to those of the method of conjugate gradients is strictly larger than the one-parameter Broyden family. In addition, … Read more
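
For reference, the one-parameter Broyden family mentioned above consists of the Hessian-approximation updates
\[
B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{y_k y_k^T}{y_k^T s_k}
+ \phi_k \,\big(s_k^T B_k s_k\big)\, w_k w_k^T,
\qquad
w_k = \frac{y_k}{y_k^T s_k} - \frac{B_k s_k}{s_k^T B_k s_k},
\]
where $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$ and $\phi_k$ is the family parameter ($\phi_k = 0$ gives BFGS, $\phi_k = 1$ gives DFP). The result stated above is that the set of quasi-Newton schemes generating CG-parallel directions on a quadratic is strictly larger than this family.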

An elementary proof of linear programming optimality conditions without using Farkas’ lemma

Although it is easy to prove the sufficient conditions for optimality of a linear program, the necessary conditions pose a pedagogical challenge. A widespread practice in deriving the necessary conditions is to invoke Farkas’ lemma, but proofs of Farkas’ lemma typically involve “nonlinear” topics such as separating hyperplanes between disjoint convex sets, or else more … Read more
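
To fix notation (standard form is assumed here purely for illustration), the conditions in question state that $x^*$ is optimal for $\min\, c^T x$ subject to $Ax = b$, $x \ge 0$ if and only if there exist $y^*$ and $s^*$ with
\[
Ax^* = b,\;\; x^* \ge 0, \qquad A^T y^* + s^* = c,\;\; s^* \ge 0, \qquad (x^*)^T s^* = 0 .
\]
Sufficiency follows from a one-line weak-duality argument; it is the existence of such a multiplier pair, i.e. necessity, that is usually delegated to Farkas’ lemma.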

On solving symmetric systems of linear equations in an unnormalized Krylov subspace framework

In an unnormalized Krylov subspace framework for solving symmetric systems of linear equations, the orthogonal vectors that are generated by a Lanczos process are not necessarily in the form of gradients. Associating each orthogonal vector with a triple, and using only the three-term recurrences of the triples, we give conditions on whether a symmetric system … Read more
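
For context, the classical (normalized) Lanczos process for a symmetric matrix $A$ generates orthonormal vectors $q_k$ via the three-term recurrence
\[
\beta_k q_{k+1} = A q_k - \alpha_k q_k - \beta_{k-1} q_{k-1},
\qquad \alpha_k = q_k^T A q_k, \qquad
\beta_k = \bigl\| A q_k - \alpha_k q_k - \beta_{k-1} q_{k-1} \bigr\|,
\]
with $q_0 = 0$. The framework above works with unnormalized analogues of these vectors, each carried as a triple that is updated only through such three-term recurrences.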

On the connection between the conjugate gradient method and quasi-Newton methods on quadratic problems

It is well known that the conjugate gradient method and a quasi-Newton method, using any well-defined update matrix from the one-parameter Broyden family of updates, produce identical iterates on a quadratic problem with positive-definite Hessian. This equivalence does not, however, hold for an arbitrary quasi-Newton method. We define precisely the conditions on the update matrix in the … Read more
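
For comparison, the conjugate gradient iterates on a quadratic with positive-definite Hessian $H$ may be written (a standard statement, included here only for reference) as
\[
p_0 = -g_0, \qquad
x_{k+1} = x_k + \alpha_k p_k, \quad \alpha_k = \frac{g_k^T g_k}{p_k^T H p_k}, \qquad
p_{k+1} = -g_{k+1} + \beta_k p_k, \quad \beta_k = \frac{g_{k+1}^T g_{k+1}}{g_k^T g_k},
\]
and the question addressed above is precisely for which update matrices the quasi-Newton search directions remain parallel to these $p_k$.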