A survey of the S-lemma

In this survey we review the many faces of the S-lemma, a result about the correctness of the S-procedure. The basic idea of this widely used method came from control theory, but it has important consequences in quadratic and semidefinite optimization, convex geometry, and linear algebra as well. These were active research areas, but as … Read more
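For orientation (this statement is standard and supplied here for reference; it is not part of the truncated abstract), the classical S-lemma reads: let $f,g:\mathbb{R}^n\to\mathbb{R}$ be quadratic functions and suppose $g(\bar x)>0$ for some $\bar x$. Then

$$ f(x)\ge 0 \ \text{ for all } x \text{ with } g(x)\ge 0 \quad\Longleftrightarrow\quad \exists\,\lambda\ge 0:\ f(x)-\lambda g(x)\ge 0 \ \ \forall x\in\mathbb{R}^n. $$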

Perturbation analysis of second order programming problems

We discuss first- and second-order optimality conditions for nonlinear second-order cone programming problems, and their relation with semidefinite programming problems. To do this, we extend the notion of optimal partition to an abstract setting. We then state a characterization of strong regularity in terms of second-order optimality conditions. Citation Research Report 5293 (August … Read more
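For context (standard definitions, not quoted from the abstract), the second-order (Lorentz) cone is

$$ \mathcal{Q}^m = \{(x_0,\bar x)\in\mathbb{R}\times\mathbb{R}^{m-1} : x_0 \ge \|\bar x\|_2\}, $$

and a nonlinear second-order cone program has the form $\min_x\, f(x)$ subject to $g_j(x)\in\mathcal{Q}^{m_j}$, $j=1,\dots,J$.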

Dual versus primal-dual interior-point methods for linear and conic programming

We observe a curious property of dual versus primal-dual path-following interior-point methods when applied to unbounded linear or conic programming problems in dual form. While primal-dual methods can be viewed as implicitly following a central path to detect primal infeasibility and dual unboundedness, dual methods are implicitly moving {\em away} from the analytic center of … Read more
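For reference (standard interior-point terminology, added here rather than taken from the truncated abstract): for a dual-form linear program $\max\{b^{\mathsf T} y : A^{\mathsf T} y + s = c,\ s\ge 0\}$, the analytic center of the dual feasible region maximizes $\sum_i \ln s_i$, while the primal-dual central path consists of the solutions of

$$ Ax=b,\qquad A^{\mathsf T} y+s=c,\qquad x_i s_i=\mu \ \ (i=1,\dots,n),\qquad x,s>0,\ \mu>0. $$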

Complex Quadratic Optimization and Semidefinite Programming

In this paper we study approximation algorithms for a class of discrete quadratic optimization problems in the Hermitian complex form. A special case of the problem that we study corresponds to the max-3-cut model used in a recent paper of Goemans and Williamson. We first develop a closed-form formula to compute the probability of … Read more
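A representative instance of this class (our paraphrase of the standard setup, not quoted from the paper) is

$$ \max\ z^{\mathsf H} Q z \quad \text{s.t.}\quad z_j \in \{1,\omega,\dots,\omega^{m-1}\},\ \ \omega = e^{2\pi i/m}, $$

with $Q$ Hermitian, together with its complex semidefinite relaxation $\max\{\operatorname{Tr}(QZ) : Z\succeq 0,\ Z_{jj}=1\}$; taking $m=3$ recovers the max-3-cut case mentioned above.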

Faster approximation algorithms for packing and covering problems

We adapt a method due to Nesterov so as to obtain an algorithm for solving block-angular fractional packing or covering problems to relative tolerance epsilon, while using a number of iterations that grows polynomially in the size of the problem and whose dependency on epsilon is proportional to 1/epsilon. Citation CORC report TR-2004-09, Computational Optimization … Read more
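A generic block-angular fractional packing problem of the kind referred to here (standard form, included for context) is

$$ \lambda^* \;=\; \min\Big\{\lambda \;:\; \sum_{k=1}^{K} A_k x_k \le \lambda b,\ \ x_k \in B_k,\ k=1,\dots,K \Big\}, $$

where each $B_k$ is a simple convex block; an $\epsilon$-approximate solution is one with $\lambda \le (1+\epsilon)\lambda^*$.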

Interior point methods for large-scale linear programming

We discuss interior point methods for large-scale linear programming, with an emphasis on methods that are useful for problems arising in telecommunications. We give the basic framework of a primal-dual interior point method, and consider the numerical issues involved in calculating the search direction in each iteration, including the use of factorization methods and/or preconditioned … Read more
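In a primal-dual method of this kind, the dominant cost per iteration is typically the solution of the normal equations (standard interior-point algebra, added for reference)

$$ A D^2 A^{\mathsf T}\,\Delta y = r, \qquad D^2 = X S^{-1}, $$

which can be handled by a sparse Cholesky factorization or, as alluded to above, by a preconditioned iterative solver.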

A new self-concordant barrier for the hypercube

In this paper we introduce a new barrier function $\sum\limits_{i=1}^n(2x_i-1)[\ln{x_i}-\ln(1-x_i)]$ to solve the following optimization problem: $\min\,\, f(x)$ subject to: $Ax=b;\;\;0\leq x\leq e$. We show that this function is a $(3/2)n$-self-concordant barrier on the hypercube $[0,1]^n$. We prove that the central path is well defined and that under an additional assumption on the objective function, … Read more
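Recall (standard Nesterov–Nemirovskii definitions, supplied here for reference) that $F$ is a $\nu$-self-concordant barrier on the interior of $[0,1]^n$ if it is convex and three times differentiable there and, for all feasible $x$ and all directions $h$,

$$ |D^3F(x)[h,h,h]| \le 2\,\big(D^2F(x)[h,h]\big)^{3/2}, \qquad \big(DF(x)[h]\big)^2 \le \nu\, D^2F(x)[h,h], $$

with $\nu=(3/2)n$ for the barrier above.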

Interior Point Trajectories and a Homogeneous Model for Nonlinear Complementarity Problems over Symmetric Cones

We study the continuous trajectories for solving monotone nonlinear mixed complementarity problems over symmetric cones. While the analysis in Faybusovich (1997) depends on the optimization theory of convex log-barrier functions, our approach is based on the paper of Monteiro and Pang (1998), where a vast set of conclusions concerning continuous trajectories is shown for monotone … Read more
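The underlying problem (standard formulation, added for context) is to find, for a monotone map $F$ and a symmetric cone $\mathcal{K}$,

$$ x \in \mathcal{K},\qquad y = F(x) \in \mathcal{K},\qquad \langle x, y\rangle = 0; $$

the mixed version additionally allows unrestricted variables coupled through equality constraints.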

Second-order Cone Programming Methods for Total Variation-based Image Restoration

In this paper we present optimization algorithms for image restoration based on the total variation (TV) minimization framework of L. Rudin, S. Osher and E. Fatemi (ROF). Our approach formulates TV minimization as a second-order cone program which is then solved by interior-point algorithms that are efficient both in practice (using nested dissection and domain … Read more
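In discrete form, the constrained ROF problem can be cast as the second-order cone program (a sketch of the standard formulation, not quoted from the truncated abstract)

$$ \min_{u,\,t}\ \sum_{i} t_i \quad \text{s.t.}\quad \|D_i u\|_2 \le t_i \ \ \forall i, \qquad \|u - f\|_2 \le \sigma, $$

where $D_i u$ is the discrete gradient at pixel $i$, $f$ is the observed image, and $\sigma$ bounds the noise level.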

Sensitivity analysis in linear optimization: Invariant support set intervals

Sensitivity analysis is one of the most interesting and challenging areas in optimization. Many attempts have been made to investigate the problem's behavior when the input data change. Usually, variation occurs in the right-hand side of the constraints and/or the objective function coefficients. Degeneracy of optimal solutions causes considerable difficulties in sensitivity analysis. In this … Read more
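In the standard setting $\min\{c^{\mathsf T} x : Ax = b,\ x \ge 0\}$, the support set of an optimal solution $x^*$ is $\sigma(x^*) = \{i : x^*_i > 0\}$, and an invariant support set interval is, roughly, the range of a right-hand-side or cost perturbation over which some optimal solution retains the same support set (our gloss on the title's terminology; the details are in the full paper).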