A conic interior point decomposition approach for large scale semidefinite programming

We describe a conic interior point decomposition approach for solving a large-scale semidefinite program (SDP) whose primal feasible set is bounded. The idea is to solve such an SDP with existing primal-dual interior point methods, iterating between a {\em master problem} and a {\em subproblem}. In our case, the master problem … Read more
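
For reference, a primal SDP in the standard form considered here (the excerpt truncates before the master/subproblem details) is
\[
\min_{X \succeq 0} \ C \bullet X \quad \text{s.t.} \quad A_i \bullet X = b_i, \quad i = 1, \dots, m,
\]
where $\bullet$ denotes the trace inner product and the boundedness assumption means that the feasible set $\{X \succeq 0 : A_i \bullet X = b_i,\ i = 1, \dots, m\}$ is bounded.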

Postponing the Choice of the Barrier Parameter in Mehrotra-Type Predictor-Corrector Algorithms

In \cite{SPT} the authors considered a variant of Mehrotra’s predictor-corrector algorithm that has been widely used in several IPM-based optimization packages. By an example they showed that this variant may take very small steps in order to keep the iterate in a certain neighborhood of the central path, which in turn implies the inefficiency of … Read more
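
The excerpt does not specify the neighborhood; a commonly used choice in predictor-corrector analyses is the negative-infinity neighborhood
\[
\mathcal{N}_{-\infty}(\gamma) = \left\{ (x, y, s) \ \text{feasible} \ : \ x_i s_i \ge \gamma \mu \ \text{for all } i \right\}, \qquad \mu = \frac{x^T s}{n}, \quad \gamma \in (0, 1),
\]
and the inefficiency referred to here stems from the damped step sizes needed to keep the iterates inside such a set.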

Generalized Support Set Invariancy Sensitivity Analysis

Support set invariancy sensitivity analysis deals with finding the range of parameter variation over which, for every parameter value, there is an optimal solution with the same set of positive variables. This approach to sensitivity analysis has been studied for Linear Optimization (LO) and Convex Quadratic Optimization (CQO) problems when they are in standard form. … Read more
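
In this terminology the support set of a vector $x \ge 0$ is $\sigma(x) = \{\, i : x_i > 0 \,\}$, and for a perturbation parameter $\epsilon$ the question is over which range of $\epsilon$ there exists an optimal solution $x^*(\epsilon)$ with $\sigma(x^*(\epsilon)) = \sigma(x^*(0))$; this notation is illustrative, since the excerpt does not fix a particular parameterization.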

New Complexity Analysis of IIPMs for Linear Optimization Based on a Specific Self-Regular Function

Primal-dual Interior-Point Methods (IPMs) have shown their ability to solve large classes of optimization problems efficiently. Feasible IPMs require a strictly feasible starting point to generate iterates that converge to an optimal solution. The self-dual embedding model provides an elegant solution to this problem at the cost of slightly increasing the size of the … Read more
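
The excerpt does not identify the specific self-regular function; as an illustration, the prototype self-regular kernel family used in this literature is
\[
\Upsilon_{p,q}(t) = \frac{t^{p+1}-1}{p(p+1)} + \frac{t^{1-q}-1}{q(q-1)} + \frac{p-q}{pq}\,(t-1), \qquad p \ge 1, \ q > 1,
\]
where the parameters $p$ and $q$ govern the growth and barrier behavior of the induced proximity measure.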

On Mehrotra-Type Predictor-Corrector Algorithms

In this paper we discuss the polynomiality of Mehrotra-type predictor-corrector algorithms. We consider a variant of the original prototype of the algorithm that has been widely used in several IPM-based optimization packages, for which no complexity result is known to date. By an example we show that in this variant the usual Mehrotra-type adaptive … Read more
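
For context, the classical Mehrotra heuristic referred to here chooses the centering parameter adaptively from the affine-scaling (predictor) step:
\[
\mu_{\mathrm{aff}} = \frac{\left(x + \alpha_{\mathrm{aff}} \Delta x^{\mathrm{aff}}\right)^T \left(s + \alpha_{\mathrm{aff}} \Delta s^{\mathrm{aff}}\right)}{n}, \qquad \sigma = \left( \frac{\mu_{\mathrm{aff}}}{\mu} \right)^{3},
\]
after which the corrector step targets the point with duality measure $\sigma \mu$ on the central path.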

Sensitivity analysis in convex quadratic optimization: simultaneous perturbation of the objective and right-hand-side vectors

In this paper we study the behavior of Convex Quadratic Optimization problems when variation occurs simultaneously in the right-hand side vector of the constraints and in the coefficient vector of the linear term in the objective function. It is proven that the optimal value function is piecewise-quadratic. The concepts of transition point and invariancy interval … Read more
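
A formulation consistent with this description (the excerpt truncates before the full problem statement) is the parametric convex quadratic program
\[
\min_x \ \Bigl\{ (c + \epsilon\, \Delta c)^T x + \tfrac{1}{2}\, x^T Q x \ : \ A x = b + \epsilon\, \Delta b, \ x \ge 0 \Bigr\},
\]
with $Q \succeq 0$, where the scalar $\epsilon$ perturbs the right-hand side and the linear objective term simultaneously along fixed directions $\Delta b$ and $\Delta c$, and the optimal value as a function of $\epsilon$ is the object shown to be piecewise quadratic.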

How good are interior point methods? Klee-Minty cubes tighten iteration-complexity bounds.

By refining a variant of the Klee-Minty example that forces the central path to visit all the vertices of the Klee-Minty $n$-cube, we exhibit a nearly worst-case example for path-following interior point methods. Namely, while the theoretical iteration-complexity upper bound is $O(2^{n}n^{\frac{5}{2}})$, we prove that solving this $n$-dimensional linear optimization problem requires at least $2^n-1$ … Read more
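
For reference, a standard unit-cube variant of the Klee-Minty construction (before any redundant constraints are added) is
\[
\min \ x_n \quad \text{s.t.} \quad 0 \le x_1 \le 1, \qquad \epsilon\, x_{k-1} \le x_k \le 1 - \epsilon\, x_{k-1}, \quad k = 2, \dots, n,
\]
with $0 < \epsilon < 1/2$; its feasible region is a perturbed $n$-cube with $2^n$ vertices.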

Magnetic Resonance Tissue Density Estimation using Optimal SSFP Pulse-Sequence Design

In this paper, we formulate a nonlinear, nonconvex semidefinite optimization problem to select the steady-state free precession (SSFP) pulse-sequence design variables that maximize the contrast-to-noise ratio in tissue segmentation. The method can be applied to other pulse-sequence types and to arbitrary numbers of tissues and images. To solve the problem we use … Read more
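
The exact objective is not reproduced in this excerpt; as an illustration, a standard definition of the contrast-to-noise ratio between two tissues $i$ and $j$, with steady-state signals $S_i(\theta)$ and $S_j(\theta)$ depending on the design variables $\theta$ and noise standard deviation $\sigma$ (symbols chosen here purely for illustration), is
\[
\mathrm{CNR}_{ij}(\theta) = \frac{\bigl| S_i(\theta) - S_j(\theta) \bigr|}{\sigma}.
\]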

A survey of the S-lemma

In this survey we review the many faces of the S-lemma, a result about the correctness of the S-procedure. The basic idea of this widely used method came from control theory but it has important consequences in quadratic and semidefinite optimization, convex geometry and linear algebra as well. These were active research areas, but as … Read more
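
For reference, the basic S-lemma in one common formulation states: let $f, g : \mathbb{R}^n \to \mathbb{R}$ be quadratic functions and suppose there exists $\bar{x}$ with $g(\bar{x}) > 0$. Then
\[
f(x) \ge 0 \ \text{ for all } x \ \text{with } g(x) \ge 0 \quad \Longleftrightarrow \quad \exists\, \lambda \ge 0 \ \text{such that} \ f(x) - \lambda\, g(x) \ge 0 \ \text{for all } x \in \mathbb{R}^n.
\]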

Sensitivity Analysis in Convex Quadratic Optimization: Invariant Support Set Interval

In sensitivity analysis one wants to know how the problem and the optimal solutions change under variation of the input data. We consider the case when the variation happens in the right-hand side of the constraints and/or in the linear term of the objective function. We are interested in finding the range of the … Read more