Data-Driven Robust Optimization Based on Kernel Learning

We propose piecewise linear kernel-based support vector clustering (SVC) as a new approach tailored to data-driven robust optimization. By solving a quadratic program, the distributional geometry of massive uncertain data can be effectively captured as a compact convex uncertainty set, which considerably reduces the conservatism of robust optimization problems. The induced robust counterpart problem retains the …
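
For background on the quadratic program mentioned above, generic support vector clustering over samples $u_1,\dots,u_N$ with kernel $K$ solves the standard dual
$$\min_{\alpha \in \mathbb{R}^N}\ \sum_{i,j}\alpha_i\alpha_j K(u_i,u_j) \;-\; \sum_i \alpha_i K(u_i,u_i) \quad \text{s.t.}\quad \sum_i \alpha_i = 1,\quad 0 \le \alpha_i \le \tfrac{1}{\nu N},$$
where $\nu \in (0,1]$ roughly controls the fraction of boundary and outlier points, and the uncertainty set is then built from a sublevel set of the learned boundary function. This is standard SVC material stated for context, not the paper's specific piecewise linear kernel construction.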

On the Size of Integer Programs with Bounded Coefficients or Sparse Constraints

Integer programming formulations describe optimization problems over a set of integer points. A fundamental problem is to determine the minimal size of such formulations, in particular when the size of the coefficients or the sparsity of the constraints is bounded. This article considers lower and upper bounds on these sizes, both in the original and in …

Robust Stochastic Optimization Made Easy with RSOME

We present a new distributionally robust optimization model called robust stochastic optimization (RSO), which unifies both scenario-tree based stochastic linear optimization and distributionally robust optimization in a practicable framework that can be solved using state-of-the-art commercial optimization solvers. We also develop a new algebraic modeling package, RSOME, to facilitate the implementation of RSO models. …
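
RSOME's own modeling syntax is not reproduced here; as a rough, generic illustration of the kind of robust model such a package builds and hands to a solver, the sketch below writes out the explicit robust counterpart of a small profit-maximization LP with box uncertainty in the objective coefficients, using cvxpy and made-up data:

```python
import cvxpy as cp
import numpy as np

# Illustrative data (not from the paper): nominal profits, per-coefficient
# deviation bounds, and resource constraints A x <= b with x >= 0.
p_nom = np.array([3.0, 5.0, 4.0])
p_hat = np.array([0.5, 1.0, 0.3])
A = np.array([[2.0, 1.0, 3.0],
              [1.0, 2.0, 1.0]])
b = np.array([10.0, 8.0])

x = cp.Variable(3, nonneg=True)

# Worst case of (p_nom + z) @ x over the box |z_i| <= p_hat_i is
# p_nom @ x - p_hat @ |x|: the adversary pushes each profit to its low end.
worst_case_profit = p_nom @ x - p_hat @ cp.abs(x)

prob = cp.Problem(cp.Maximize(worst_case_profit), [A @ x <= b])
prob.solve()
print("worst-case profit:", prob.value)
print("decision:", x.value)
```

A modeling package like RSOME automates this reformulation step for much richer uncertainty sets and for the scenario-tree and distributionally robust models the abstract describes.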

Analyzing Random Permutations for Cyclic Coordinate Descent

We consider coordinate descent methods on convex quadratic problems, in which exact line searches are performed at each iteration. (This algorithm is identical to Gauss-Seidel on the equivalent symmetric positive definite linear system.) We describe a class of convex quadratic problems for which the random-permutations version of cyclic coordinate descent (RPCD) outperforms the standard cyclic …
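
As a rough sketch of the two algorithms being compared (a toy setup, not the paper's experiments), exact-line-search coordinate descent on $f(x) = \tfrac{1}{2}x^\top A x - b^\top x$ updates one coordinate at a time; the cyclic and random-permutation variants differ only in the order in which coordinates are visited:

```python
import numpy as np

def coordinate_descent(A, b, n_epochs, permute, rng=None):
    """Exact-line-search coordinate descent on f(x) = 0.5 x^T A x - b^T x
    (equivalently Gauss-Seidel on A x = b). If permute=True, a fresh random
    permutation of the coordinates is drawn each epoch (RPCD); otherwise the
    fixed order 0, 1, ..., n-1 is used (cyclic CD)."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = len(b)
    x = np.zeros(n)
    for _ in range(n_epochs):
        order = rng.permutation(n) if permute else np.arange(n)
        for i in order:
            # Exact minimization along coordinate i: zero out the i-th
            # partial derivative A[i] @ x - b[i] by adjusting x[i].
            x[i] += (b[i] - A[i] @ x) / A[i, i]
    return x

# Toy symmetric positive definite quadratic, for illustration only.
n = 50
A = np.eye(n) + 0.9 * (np.ones((n, n)) - np.eye(n))
b = np.ones(n)
for permute in (False, True):
    x = coordinate_descent(A, b, n_epochs=20, permute=permute)
    print("RPCD" if permute else "cyclic", np.linalg.norm(A @ x - b))
```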

A primal-dual interior-point method based on various selections of displacement step for second-order cone programming

In this paper, a primal-dual interior-point method equipped with various selections of the displacement step is derived for solving second-order cone programming problems. We first establish the existence and uniqueness of the optimal solution of the corresponding perturbed problem and then demonstrate its convergence to the optimal solution of the original problem. Next, we present …
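
For context, a primal-dual method of this kind works with the standard SOCP pair (generic notation, not necessarily the paper's): minimize $c^\top x$ subject to $Ax = b$, $x \in \mathcal{K}$, together with its dual, maximize $b^\top y$ subject to $A^\top y + s = c$, $s \in \mathcal{K}$, where $\mathcal{K}$ is a product of second-order cones. The perturbed problem replaces complementarity by the centering condition
$$Ax = b, \qquad A^\top y + s = c, \qquad x \circ s = \mu e, \qquad x,\ s \in \operatorname{int}\mathcal{K},\ \mu > 0,$$
with $\circ$ the Jordan product; the displacement step is the step length taken along the Newton direction for this system while keeping $x$ and $s$ in the interior of the cone.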

A logarithmic barrier interior-point method based on majorant functions for second-order cone programming

We present a logarithmic barrier interior-point method for solving a second-order cone programming problem. Newton’s method is used to compute the descent direction, and majorant functions are used as an efficient alternative to line search methods to determine the displacement step along the direction. The efficiency of our method is demonstrated through numerical experiments. …
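
As standard background (not the paper's derivation), the logarithmic barrier of a single second-order cone $\mathcal{Q}^n = \{x = (x_0,\bar{x}) \in \mathbb{R}\times\mathbb{R}^{n-1} : x_0 \ge \|\bar{x}\|_2\}$ is $\varphi(x) = -\log(x_0^2 - \|\bar{x}\|_2^2)$, and the method works on the barrier subproblem
$$\min_x\ c^\top x + \mu\,\varphi(x) \quad \text{s.t.}\quad Ax = b, \qquad \mu > 0,$$
computing the Newton direction for this subproblem; a majorant of the one-dimensional function $t \mapsto$ (barrier objective at $x + t d$) then yields an explicit displacement step in place of a line search.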

Optimality conditions for minimizers at infinity in polynomial programming

In this paper we study necessary optimality conditions for the optimization problem $$\textrm{infimum}\; f_0(x) \quad \textrm{subject to} \quad x \in S,$$ where $f_0 \colon \mathbb{R}^n \rightarrow \mathbb{R}$ is a polynomial function and $S \subset \mathbb{R}^n$ is a set defined by polynomial inequalities. Assume that the problem is bounded below and has the Mangasarian–Fromovitz property …
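
A classical example of the phenomenon behind such conditions (not taken from the paper): the polynomial $f_0(x_1,x_2) = (x_1 x_2 - 1)^2 + x_1^2$ satisfies $\inf_{x \in \mathbb{R}^2} f_0(x) = 0$, yet the infimum is not attained; along $x_1 = 1/t$, $x_2 = t$ the value $1/t^2 \to 0$ as $t \to \infty$, while $f_0(x) = 0$ would require $x_1 = 0$ and $x_1 x_2 = 1$ simultaneously. Minimizing sequences therefore escape to infinity, which is the situation optimality conditions at infinity address.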

A New Voltage Stability-Constrained Optimal Power Flow Model: Sufficient Condition, SOCP Representation, and Relaxation

A simple characterization of the solvability of power flow equations is of great importance in the monitoring, control, and protection of power systems. In this paper, we introduce a sufficient condition for power flow Jacobian nonsingularity. We show that this condition is second-order conic representable when load powers are fixed. Through the incorporation of the …
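
For reference, the power flow equations in question are, in their standard polar bus-injection form (generic notation, not necessarily the paper's),
$$P_i = V_i \sum_{j} V_j\,(G_{ij}\cos\theta_{ij} + B_{ij}\sin\theta_{ij}), \qquad Q_i = V_i \sum_{j} V_j\,(G_{ij}\sin\theta_{ij} - B_{ij}\cos\theta_{ij}),$$
where $V_i$ are voltage magnitudes, $\theta_{ij} = \theta_i - \theta_j$ are angle differences, and $G + jB$ is the bus admittance matrix; the Jacobian whose nonsingularity is certified is that of these equations with respect to the voltage variables.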

Integration Methods and Accelerated Optimization Algorithms

We show that accelerated optimization methods can be seen as particular instances of multi-step integration schemes from numerical analysis, applied to the gradient flow equation. In comparison with recent advances in this vein, the differential equation considered here is the basic gradient flow, and we show that multi-step schemes allow integration of this differential equation …
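
To make the connection concrete (a standard sketch, not the paper's derivation): the gradient flow is $\dot{x}(t) = -\nabla f(x(t))$, and a generic two-step linear discretization with coefficients $\alpha, \beta$ reads
$$x_{k+1} = x_k + \beta\,(x_k - x_{k-1}) - \alpha\,\nabla f(x_k),$$
which already has the form of Polyak's heavy-ball iteration; choosing the multi-step coefficients appropriately is what recovers accelerated methods.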

Sharpness, Restart and Acceleration

The Łojasiewicz inequality shows that sharpness bounds on the minimum of convex optimization problems hold almost generically. Here, we show that sharpness directly controls the performance of restart schemes. The constants quantifying sharpness are of course unobservable, but we show that optimal restart strategies are fairly robust, and searching for the best scheme only increases …
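
As a rough sketch of the kind of restart scheme being analyzed (a generic fixed-restart accelerated gradient method, not the paper's exact strategy), the momentum is simply reset every `restart_every` iterations:

```python
import numpy as np

def accelerated_gradient_with_restart(grad, x0, L, n_iters, restart_every):
    """Nesterov-style accelerated gradient descent with a fixed restart period.
    grad: gradient oracle, L: gradient Lipschitz constant (step size 1/L)."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for k in range(1, n_iters + 1):
        x_new = y - grad(y) / L                          # gradient step from the extrapolated point
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # momentum / extrapolation
        x, t = x_new, t_new
        if k % restart_every == 0:                       # restart: drop momentum, keep current iterate
            y = x.copy()
            t = 1.0
    return x

# Toy usage on a strongly convex quadratic (illustration only).
A = np.diag([1.0, 10.0, 100.0])
grad = lambda x: A @ x
x_out = accelerated_gradient_with_restart(grad, np.ones(3), L=100.0, n_iters=300, restart_every=30)
print(np.linalg.norm(x_out))
```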