Target following algorithms for semidefinite programming

We present a target-following framework for semidefinite programming, which generalizes the target-following framework for linear programming. We use this framework to build weighted path-following interior-point algorithms of three distinct flavors: short-step, predictor-corrector, and large-update. These algorithms have worst-case iteration bounds that parallel their counterparts in linear programming. We further consider the problem of finding analytic …
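As a rough sketch of the kind of condition a target-following method works with (generic notation, not taken from the abstract): in linear programming the central-path equations x_i s_i = μ are relaxed to a vector of positive targets, and a common semidefinite analogue replaces the matrix relation XS = μI by a positive definite target matrix after symmetrization.

```latex
% Illustrative only; w and W are generic target notation.
\begin{align*}
\text{LP:}  &\quad Ax = b,\;\; A^{\mathsf T} y + s = c,\;\; x, s > 0,
             \qquad x_i s_i = w_i > 0,\;\; i = 1,\dots,n,\\
\text{SDP:} &\quad \mathcal{A}(X) = b,\;\; \mathcal{A}^{*}(y) + S = C,\;\;
             X, S \succ 0,
             \qquad X^{1/2} S X^{1/2} = W \succ 0 .
\end{align*}
```

Setting w_i ≡ μ (respectively W = μI) recovers the usual central path.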

Robust and Data-Driven Optimization: Modern Decision-Making Under Uncertainty

Traditional models of decision-making under uncertainty assume perfect information, i.e., accurate values for the system parameters and specific probability distributions for the random variables. However, such precise knowledge is rarely available in practice, and a strategy based on erroneous inputs might be infeasible or exhibit poor performance when implemented. The purpose of this tutorial is …

Recognizing Underlying Sparsity in Optimization

Exploiting sparsity is essential to improve the efficiency of solving large optimization problems. We present a method for recognizing the underlying sparsity structure of a nonlinear partially separable problem, and show how the sparsity of the Hessian matrices of the problem’s functions can be improved by performing a nonsingular linear transformation in the space corresponding …
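The underlying structure can be sketched as follows (illustrative notation, not the paper's): a partially separable function is a sum of element functions acting on low-dimensional subspaces, its Hessian is the corresponding sum of small blocks, and a nonsingular change of variables x = Ty transforms that Hessian by congruence, which is what leaves room for improving its sparsity.

```latex
% Illustrative sketch of partial separability and a change of variables.
\begin{align*}
f(x) &= \sum_{i=1}^{m} f_i(U_i x), \qquad
\nabla^{2} f(x) = \sum_{i=1}^{m} U_i^{\mathsf T}\, \nabla^{2} f_i(U_i x)\, U_i,\\
x &= T y \;\Longrightarrow\;
\nabla^{2}_{y}\, f(Ty) = T^{\mathsf T}\, \nabla^{2} f(Ty)\, T .
\end{align*}
```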

Implementation of Warm-Start Strategies in Interior-Point Methods for Linear Programming in Fixed Dimension

We implement several warm-start strategies in interior-point methods for linear programming (LP). We study the situation in which both the original LP instance and the perturbed one have exactly the same dimensions. We consider different types of perturbations of data components of the original instance and different sizes of each type of perturbation. We modify …
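The setting can be summarized as follows (a generic sketch, not the specific strategies implemented in the paper): the original and perturbed instances share the same dimensions, and a warm start reuses information from the solved original instance rather than solving the perturbed one from scratch.

```latex
% Generic sketch: a same-dimension perturbation of an LP; a warm start
% reuses the original instance's final iterate (x^*, y^*, s^*), typically
% adjusted to regain strict interior feasibility. The paper's actual
% modification strategies are not reproduced here.
\begin{align*}
\text{original:}  &\quad \min\; c^{\mathsf T} x \;\;\text{s.t.}\;\; Ax = b,\; x \ge 0,\\
\text{perturbed:} &\quad \min\; (c+\Delta c)^{\mathsf T} x \;\;\text{s.t.}\;\;
(A+\Delta A)\,x = b+\Delta b,\; x \ge 0 .
\end{align*}
```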

Exploiting Equalities in Polynomial Programming

We propose a novel solution approach for polynomial programming problems with equality constraints. By means of a generic transformation, we show that solution schemes for the (typically simpler) problem without equalities can be used to address the problem with equalities. In particular, we propose new solution schemes for mixed binary programs, pure 0-1 quadratic programs, …
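One classical transformation in this spirit, shown purely for illustration (it is not necessarily the transformation proposed in the paper), folds the equality constraints into the objective through a penalty term, so that a scheme for the equality-free problem can be applied.

```latex
% Quadratic-penalty illustration only; the paper's generic transformation
% may be different.
\min_{x}\; p(x)\ \ \text{s.t.}\ \ h_j(x) = 0,\ j = 1,\dots,k
\qquad\longrightarrow\qquad
\min_{x}\; p(x) + M \sum_{j=1}^{k} h_j(x)^{2},\qquad M > 0\ \text{large}.
```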

Multi-group Support Vector Machines with measurement costs: a biobjective approach

The Support Vector Machine has been shown to have good performance in many practical classification settings. In this paper we propose, for multi-group classification, a biobjective optimization model in which we consider not only the generalization ability (modelled through the margin maximization), but also costs associated with the features. This cost is not limited to an economical …
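In the two-class special case, a biobjective model of this flavor can be sketched as follows (generic notation, simplified from the paper's multi-group setting): one objective controls the margin and misclassification errors, the other the total cost of the features actually used.

```latex
% Simplified two-class sketch; c_j is the cost of feature j and the
% binary variable z_j indicates whether feature j enters the classifier.
\min_{w,\, b,\, \xi,\, z}\;
\Bigl(\, \tfrac12 \|w\|^{2} + C \sum_{i} \xi_i\, ,\ \ \sum_{j} c_j z_j \,\Bigr)
\quad \text{s.t.} \quad
y_i \bigl(w^{\mathsf T} x_i + b\bigr) \ge 1 - \xi_i,\;\; \xi_i \ge 0,\;\;
|w_j| \le M z_j,\;\; z_j \in \{0,1\}.
```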

Steepest descent method for quasiconvex minimization on Riemannian manifolds

This paper extends the full convergence of the steepest descent algorithm with a generalized Armijo search and a proximal regularization to solve quasiconvex minimization problems defined on complete Riemannian manifolds. Previous convergence results are obtained as particular cases of our approach, and some examples in non-Euclidean spaces are given. Citation: J. Math. Anal. Appl. 341 …
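For concreteness, here is a minimal sketch of steepest descent with a backtracking Armijo search in the Euclidean special case; on a Riemannian manifold the straight-line update would be replaced by a retraction or exponential map and the Riemannian gradient, which this sketch omits.

```python
# Minimal sketch: Euclidean steepest descent with an Armijo backtracking
# search. The Riemannian version replaces x - t*g by a retraction of
# -t*g at x and uses the Riemannian gradient; that machinery is omitted.
import numpy as np

def armijo_steepest_descent(f, grad, x0, t0=1.0, beta=0.5, sigma=1e-4,
                            tol=1e-8, max_iter=500):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t = t0
        # Armijo condition: sufficient decrease along the direction -g.
        while f(x - t * g) > f(x) - sigma * t * np.dot(g, g):
            t *= beta
        x = x - t * g
    return x

# Example on a smooth (quasi)convex function of two variables.
f = lambda x: np.sqrt(1.0 + np.dot(x, x))
grad = lambda x: x / np.sqrt(1.0 + np.dot(x, x))
print(armijo_steepest_descent(f, grad, [3.0, -2.0]))  # approx. [0, 0]
```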

Detecting relevant variables and interactions for classification in Support Vector Machines

The widely used Support Vector Machine (SVM) method has been shown to yield good results in Supervised Classification problems. The Binarized SVM (BSVM) is a variant which is able to automatically detect which variables are, by themselves, most relevant for the classifier. In this work, we extend the BSVM introduced by the authors to a method …

Totally Unimodular Stochastic Programs

We consider totally unimodular stochastic programs, that is, stochastic programs whose extensive-form constraint matrix is totally unimodular. We generalize the notion of total unimodularity to apply to sets of matrices and provide properties of such sets. Using this notion, we give several sufficient conditions for specific classes of problems. When solving such problems using the …
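The defining property (every square submatrix has determinant -1, 0, or +1) can be checked by brute force on tiny examples; the sketch below is purely illustrative and impractical beyond very small matrices, where structural characterizations are used instead.

```python
# Brute-force total unimodularity check: every square submatrix must have
# determinant in {-1, 0, 1}. Exponential cost; for illustration only.
import itertools
import numpy as np

def is_totally_unimodular(A):
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    for k in range(1, min(m, n) + 1):
        for rows in itertools.combinations(range(m), k):
            for cols in itertools.combinations(range(n), k):
                det = round(np.linalg.det(A[np.ix_(rows, cols)]))
                if det not in (-1, 0, 1):
                    return False
    return True

# Each column has at most one +1 and one -1, a classical TU pattern.
A = [[ 1, -1,  0,  0],
     [ 0,  1, -1,  0],
     [ 0,  0,  1, -1]]
print(is_totally_unimodular(A))  # True
```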

Coordinate and Subspace Optimization Methods for Linear Least Squares with Non-Quadratic Regularization

This work addresses the problem of regularized linear least squares (RLS) with non-quadratic separable regularization. Despite being frequently deployed in many applications, the RLS problem is often hard to solve using standard iterative methods. In a recent work [10], a new iterative method called Parallel Coordinate Descent (PCD) was devised. We provide herein a convergence …
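A representative instance of this problem class is l1-regularized least squares; the sketch below solves it with plain cyclic coordinate descent to illustrate the separable non-quadratic regularization, and is not the PCD method of [10].

```python
# Cyclic coordinate descent for  min_x 0.5*||A x - b||^2 + lam*||x||_1,
# an RLS problem with a separable, non-quadratic regularizer. This is a
# plain illustration, not the Parallel Coordinate Descent (PCD) of [10].
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def coordinate_descent_l1(A, b, lam, n_sweeps=100):
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x                      # running residual b - A x
    col_sq = np.sum(A * A, axis=0)     # squared column norms ||a_j||^2
    for _ in range(n_sweeps):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            r += A[:, j] * x[j]        # remove coordinate j from residual
            x[j] = soft_threshold(A[:, j] @ r, lam) / col_sq[j]
            r -= A[:, j] * x[j]        # restore residual with updated x_j
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20); x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
print(np.round(coordinate_descent_l1(A, b, lam=0.5), 2))
```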