Mehrotra-type predictor-corrector algorithms revisited

Motivated by a numerical example which shows that a feasible version of Mehrotra’s original predictor-corrector algorithm might be inefficient in practice, Salahi et al. proposed a so-called safeguard-based variant of the algorithm that enjoys polynomial iteration complexity while preserving its practical efficiency. In this paper we analyze the same algorithm of Mehrotra from a …
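
For readers unfamiliar with the algorithm the abstract refers to, the following is a minimal, textbook-style sketch of one Mehrotra predictor-corrector iteration for a standard-form LP ($\min c^Tx$ s.t. $Ax=b$, $x\ge 0$). It is an illustrative sketch only, not the safeguarded variant of Salahi et al. analyzed here; the function name, the 0.9995 step-length fraction, and the cubic centering heuristic are generic choices.

```python
import numpy as np

def mehrotra_step(A, b, c, x, y, s):
    """One Mehrotra predictor-corrector step (illustrative sketch, no safeguard).

    Standard-form LP: min c^T x  s.t.  Ax = b, x >= 0, with dual slacks s.
    Assumes the current iterate satisfies x > 0 and s > 0.
    """
    n = A.shape[1]
    rp = b - A @ x                      # primal residual
    rd = c - A.T @ y - s                # dual residual
    mu = x @ s / n                      # duality measure
    d = x / s                           # diagonal of D = X S^{-1}
    M = A @ (d[:, None] * A.T)          # normal-equations matrix A D A^T

    def solve_newton(rc):
        # Eliminate ds and dx from the Newton system, solve for dy, back-substitute.
        dy = np.linalg.solve(M, rp + A @ (d * rd - rc / s))
        ds = rd - A.T @ dy
        dx = (rc - x * ds) / s
        return dx, dy, ds

    def step_to_boundary(v, dv):
        neg = dv < 0
        return min(1.0, 0.9995 * np.min(-v[neg] / dv[neg])) if neg.any() else 1.0

    # Predictor (affine-scaling) direction.
    dx_a, _, ds_a = solve_newton(-x * s)
    mu_aff = ((x + step_to_boundary(x, dx_a) * dx_a)
              @ (s + step_to_boundary(s, ds_a) * ds_a)) / n
    sigma = (mu_aff / mu) ** 3          # Mehrotra's adaptive centering parameter

    # Corrector direction: centering plus the second-order term dx_a * ds_a.
    dx, dy, ds = solve_newton(sigma * mu - x * s - dx_a * ds_a)
    ap, ad = step_to_boundary(x, dx), step_to_boundary(s, ds)
    return x + ap * dx, y + ad * dy, s + ad * ds
```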

Sensitivity analysis in linear semi-infinite programming via partitions

This paper provides sufficient conditions for the optimal value function of a given linear semi-infinite programming problem to depend linearly on the size of the perturbations, when these perturbations are directional, involve either the cost coefficients, the right-hand-side function, or both, and are sufficiently small. Two kinds of partitions are considered. The first …
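
In symbols, the property in question is of the following schematic form, where the notation $v$, $\gamma$, $\bar\varepsilon$ is illustrative rather than the paper's: for a fixed perturbation direction $(\Delta c, \Delta b)$ there are a slope $\gamma$ and a threshold $\bar\varepsilon > 0$ such that

$$v\bigl(c + \varepsilon\,\Delta c,\; b + \varepsilon\,\Delta b\bigr) \;=\; v(c, b) \;+\; \varepsilon\,\gamma \qquad \text{for all } \varepsilon \in [0, \bar\varepsilon].$$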

Finding a point in the relative interior of a polyhedron

A new initialization or ‘Phase I’ strategy for feasible interior point methods for linear programming is proposed that computes a point on the primal-dual central path associated with the linear program. Provided there exist primal-dual strictly feasible points — an all-pervasive assumption in interior point method theory that implies the existence of the central path …
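
For reference, for a standard-form LP $\min c^Tx$ s.t. $Ax=b$, $x\ge 0$, the primal-dual central path mentioned in the abstract is the set of solutions, parameterized by $\mu > 0$, of

$$Ax = b,\qquad A^{\top}y + s = c,\qquad x_i s_i = \mu \ \ (i=1,\dots,n),\qquad x > 0,\ s > 0;$$

the proposed Phase I strategy computes a point satisfying these conditions for some $\mu$ (standard-form notation is used here only for illustration).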

Primal-dual first-order methods with ${\cal O}(1/\epsilon)$ iteration-complexity for cone programming

In this paper we consider the general cone programming problem, and propose primal-dual convex (smooth and/or nonsmooth) minimization reformulations for it. We then discuss first-order methods suitable for solving these reformulations, namely, Nesterov’s optimal method \cite{Nest83-1,Nest05-1}, Nesterov’s smooth approximation scheme \cite{Nest05-1}, and Nemirovski’s prox-method \cite{Nem05-1}, and propose a variant of Nesterov’s optimal method which has …
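
As a point of reference for the methods discussed, here is a minimal sketch of the accelerated gradient scheme usually referred to as Nesterov’s optimal method, for a smooth convex objective with $L$-Lipschitz gradient. It is a generic illustration, not the variant proposed in the paper, and the least-squares test problem is made up for the example.

```python
import numpy as np

def nesterov_accelerated(grad_f, L, x0, num_iters=500):
    """Generic accelerated gradient scheme ('Nesterov's optimal method') for a
    smooth convex objective with L-Lipschitz gradient.  Illustrative sketch only."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(num_iters):
        x_next = y - grad_f(y) / L                         # gradient step at extrapolated point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # momentum extrapolation
        x, t = x_next, t_next
    return x

# Illustrative use on f(x) = 0.5 * ||Ax - b||^2 with random data.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient A^T(Ax - b)
x_min = nesterov_accelerated(lambda x: A.T @ (A @ x - b), L, np.zeros(10))
```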

Exact regularization of convex programs

The regularization of a convex program is exact if all solutions of the regularized problem are also solutions of the original problem for all values of the regularization parameter below some positive threshold. For a general convex program, we show that the regularization is exact if and only if a certain selection problem has a …
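
In symbols, for a convex program $\min_{x \in C} f(x)$ with regularizer $\phi$ (the notation $f$, $\phi$, $C$, $\delta$ is illustrative, not the paper's), the definition in the first sentence reads: the regularization is exact if there exists $\bar\delta > 0$ such that

$$\operatorname*{argmin}_{x\in C}\bigl\{f(x) + \delta\,\phi(x)\bigr\} \;\subseteq\; \operatorname*{argmin}_{x\in C} f(x) \qquad \text{for all } \delta \in (0,\bar\delta].$$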

A Simpler and Tighter Redundant Klee-Minty Construction

By introducing redundant Klee-Minty examples, we have previously shown that the central path can be bent along the edges of the Klee-Minty cubes, thus having $2^n-2$ sharp turns in dimension $n$. In those constructions the redundant hyperplanes were placed parallel to the facets active at the optimal solution. In this paper we present a simpler …
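
For context, one common parameterization of the $n$-dimensional Klee-Minty cube used in such constructions (conventions differ between papers) is

$$0 \le x_1 \le 1, \qquad \epsilon\, x_{k-1} \;\le\; x_k \;\le\; 1 - \epsilon\, x_{k-1}, \quad k = 2, \dots, n,$$

with a small $\epsilon \in (0, \tfrac12)$; the redundant constructions then add exponentially many redundant inequalities to these $2n$ facet-defining ones.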

Correlative sparsity in primal-dual interior-point methods for LP, SDP and SOCP

Exploiting sparsity has been a key issue in solving large-scale optimization problems. The most time-consuming part of primal-dual interior-point methods for linear programs, second-order cone programs, and semidefinite programs is solving the Schur complement equation at each iteration, usually by the Cholesky factorization. The computational efficiency is greatly affected by the sparsity of the coefficient …
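
In the LP case, the Schur complement equation referred to above is the normal-equations system of each interior-point iteration. A minimal dense sketch follows; it is illustrative only, and in particular ignores the sparsity of the coefficient matrix that the paper is concerned with exploiting.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def schur_complement_solve(A, x, s, rhs):
    """Solve the LP Schur complement (normal) equations  (A D A^T) dy = rhs,
    with D = diag(x / s) -- the dominant cost of an interior-point iteration.

    Dense illustrative sketch: production codes keep A D A^T sparse, order it
    to reduce fill-in, and reuse a symbolic Cholesky factorization of its
    fixed sparsity pattern across iterations.
    """
    d = x / s
    M = A @ (d[:, None] * A.T)            # Schur complement matrix A D A^T
    return cho_solve(cho_factor(M), rhs)  # Cholesky M = L L^T, then two triangular solves
```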

Central path curvature and iteration-complexity for redundant Klee-Minty cubes

We consider a family of linear optimization problems over the $n$-dimensional Klee-Minty cube and show that the central path may visit all of its vertices in the same order as simplex methods do. This is achieved by carefully adding an exponential number of redundant constraints that force the central path to take at least $2^n-2$ …

Implementation of Warm-Start Strategies in Interior-Point Methods for Linear Programming in Fixed Dimension

We implement several warm-start strategies in interior-point methods for linear programming (LP). We study the situation in which both the original LP instance and the perturbed one have exactly the same dimensions. We consider different types of perturbations of data components of the original instance and different sizes of each type of perturbation. We modify …
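
As a rough illustration of what a warm-start strategy does (the strategies implemented in the paper are more elaborate and also have to deal with the perturbed data; the shift parameter below is an arbitrary illustrative choice):

```python
import numpy as np

def warm_start_shift(x_opt, s_opt, lam=0.01):
    """Pull an optimal primal-dual pair of the original LP strictly into the
    positive orthant, so an interior-point method on the perturbed instance
    can start from it instead of from a cold default point.

    Generic illustration only -- not one of the specific strategies studied
    in the paper.
    """
    e = np.ones_like(x_opt)
    x0 = (1.0 - lam) * x_opt + lam * e   # move x away from the boundary
    s0 = (1.0 - lam) * s_opt + lam * e   # same for the dual slacks
    return x0, s0
```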

Representing the space of linear programs as a Grassmannian

We represent the space of linear programs as the space of projection matrices. Projection matrices of the same dimension and rank comprise a Grassmannian, which has rich geometric and algebraic structures. An ordinary differential equation on the space of projection matrices defines a path for each projection matrix associated with a linear programming instance and …
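
One natural way an LP’s constraint data determines a projection matrix of fixed rank, and hence a point on a Grassmannian, is via the orthogonal projection onto the row space of the constraint matrix. The sketch below is only meant to make that correspondence concrete and may differ from the construction used in the paper.

```python
import numpy as np

def row_space_projection(A):
    """Orthogonal projection onto the row space of A (assumed full row rank).

    A rank-m projection matrix like this is a point on the Grassmannian
    Gr(m, n); this is one natural way to attach a projection matrix to an
    LP's constraint data, and may differ from the paper's construction.
    """
    return A.T @ np.linalg.solve(A @ A.T, A)   # P = A^T (A A^T)^{-1} A, with P^2 = P

# Example: a 2x4 constraint matrix gives a rank-2 projection, a point in Gr(2, 4).
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0, 1.0]])
P = row_space_projection(A)
assert np.allclose(P @ P, P) and np.isclose(np.trace(P), 2.0)
```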