Data-Driven Robust Optimization Based on Kernel Learning

We propose piecewise linear kernel-based support vector clustering (SVC) as a new approach tailored to data-driven robust optimization. By solving a quadratic program, the distributional geometry of massive uncertain data can be effectively captured as a compact convex uncertainty set, which considerably reduces the conservatism of robust optimization problems. The induced robust counterpart problem retains the …
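
As a rough illustration of the idea (not the paper's method), the sketch below learns a compact region enclosing sampled data with a support vector model: scikit-learn's OneClassSVM solves the underlying quadratic program, with a standard RBF kernel standing in for the paper's piecewise linear kernel.

```python
# A minimal sketch, assuming scikit-learn; the RBF kernel is a stand-in
# for the piecewise linear kernel proposed in the paper.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
data = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=500)

# nu upper-bounds the fraction of samples left outside the learned set,
# which tunes how conservative the resulting uncertainty set is.
svc = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.05).fit(data)

# Points with a nonnegative decision value lie inside the learned set.
inside = svc.decision_function(data) >= 0
print(f"fraction of samples covered: {inside.mean():.3f}")
```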

Infeasibility detection in the alternating direction method of multipliers for convex optimization

The alternating direction method of multipliers is a powerful operator splitting technique for solving structured optimization problems. For convex optimization problems, it is well known that the algorithm generates iterates that converge to a solution, provided that one exists. If a solution does not exist, the iterates diverge. Nevertheless, we show that they yield conclusive …
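
To see the phenomenon on a toy instance (this illustrates, but does not reproduce, the paper's general result), the sketch below runs ADMM on the infeasible problem of finding a point in both of two disjoint intervals: the dual iterate diverges, while its successive differences converge and certify primal infeasibility.

```python
# ADMM on x in C1 = [0, 1] and x in C2 = [2, 3]: an infeasible problem.
import numpy as np

def proj(v, lo, hi):
    return np.clip(v, lo, hi)

z, u = 0.0, 0.0
for k in range(50):
    x = proj(z - u, 0.0, 1.0)   # x-update: project onto C1
    z = proj(x + u, 2.0, 3.0)   # z-update: project onto C2
    du = x - z                  # increment of the dual variable
    u += du                     # dual update
print(f"dual iterate u = {u:.1f} (diverging linearly)")
print(f"successive difference du = {du:.3f} (a converged infeasibility certificate)")
```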

Optimality conditions for minimizers at infinity in polynomial programming

In this paper, we study necessary optimality conditions for the optimization problem $$\inf f_0(x) \quad \textrm{subject to} \quad x \in S,$$ where $f_0 \colon \mathbb{R}^n \rightarrow \mathbb{R}$ is a polynomial function and $S \subset \mathbb{R}^n$ is a set defined by polynomial inequalities. Assume that the problem is bounded below and has the Mangasarian–Fromovitz property …
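
A classical toy instance of this situation (not taken from the paper) is a polynomial problem whose infimum is attained only "at infinity": no minimizer exists, and every minimizing sequence is unbounded.

```latex
\[
  \inf_{(x,y) \in \mathbb{R}^2} \; (xy - 1)^2 + x^2 \;=\; 0,
  \qquad \text{e.g. } (x_k, y_k) = \bigl(\tfrac{1}{k},\, k\bigr), \quad \|(x_k, y_k)\| \to \infty .
\]
```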

A logarithmic barrier interior-point method based on majorant functions for second-order cone programming

We present a logarithmic barrier interior-point method for solving a second-order cone programming problem. Newton’s method is used to compute the descent direction, and majorant functions are used as an efficient alternative to line search methods to determine the displacement step along that direction. The efficiency of our method is demonstrated through numerical experiments. …
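
As a minimal sketch of the ingredients (assuming the standard logarithmic barrier for the second-order cone, and a fixed damping factor in place of the paper's majorant-function step rule), the code below runs damped Newton on a barrier centering problem with a known solution.

```python
# Centering problem: minimize t*c'x - log(x0^2 - ||x1:||^2) over the
# interior of the cone K = {x : x[0] >= ||x[1:]||}; for c = e0 the
# minimizer is (2/t, 0, ..., 0).
import numpy as np

def barrier_grad_hess(x):
    J = np.diag(np.r_[1.0, -np.ones(len(x) - 1)])
    s = x @ J @ x                      # s = x0^2 - ||x1:||^2 > 0 inside K
    g = -2.0 * (J @ x) / s             # gradient of the barrier
    H = np.outer(g, g) - 2.0 * J / s   # Hessian of the barrier
    return g, H

t, c = 1.0, np.array([1.0, 0.0, 0.0])
x = np.array([3.0, 0.5, -0.5])         # strictly feasible starting point
for _ in range(30):
    g, H = barrier_grad_hess(x)
    x -= 0.5 * np.linalg.solve(H, t * c + g)   # damped Newton step
print(x)                               # approaches (2/t, 0, 0)
```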

A primal-dual interior-point method based on various selections of displacement step for second-order cone programming

In this paper, a primal-dual interior-point method equipped with various selections of the displacement step is derived for solving second-order cone programming problems. We first establish the existence and uniqueness of the optimal solution of the corresponding perturbed problem and then demonstrate its convergence to the optimal solution of the original problem. Next, we present …
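
One standard displacement-step rule, shown here purely for illustration (it is not necessarily among the paper's selections), is the fraction-to-the-boundary choice: take the largest step that keeps the iterate inside the second-order cone, scaled back by a safety factor.

```python
# Largest alpha with x + alpha*dx in K = {x : x[0] >= ||x[1:]||},
# found from the quadratic (x0 + a*d0)^2 - ||x1: + a*d1:||^2 = 0.
import numpy as np

def max_step(x, dx, safety=0.99):
    J = np.diag(np.r_[1.0, -np.ones(len(x) - 1)])
    a2, a1, a0 = dx @ J @ dx, 2.0 * (x @ J @ dx), x @ J @ x
    roots = np.roots([a2, a1, a0]) if abs(a2) > 1e-12 else np.roots([a1, a0])
    pos = [r.real for r in roots if abs(r.imag) < 1e-10 and r.real > 0]
    if dx[0] < 0:                      # the first component must stay positive
        pos.append(-x[0] / dx[0])
    return safety * min(pos) if pos else np.inf

x = np.array([2.0, 1.0, 0.5])          # strictly interior: 2 > ||(1, 0.5)||
dx = np.array([-1.0, 1.0, 0.0])        # a direction that exits the cone
print(max_step(x, dx))                 # about 0.454
```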

Analyzing Random Permutations for Cyclic Coordinate Descent

We consider coordinate descent methods on convex quadratic problems, in which exact line searches are performed at each iteration. (This algorithm is identical to Gauss-Seidel on the equivalent symmetric positive definite linear system.) We describe a class of convex quadratic problems for which the random-permutations version of cyclic coordinate descent (RPCD) outperforms the standard cyclic …
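
A minimal sketch of the comparison, on a quadratic whose Hessian has identical off-diagonal entries, the kind of matrix analyzed in the paper:

```python
# CCD vs. RPCD with exact coordinate minimization (Gauss-Seidel) on
# f(x) = 0.5*x'Ax with A = (1 - delta)*I + delta*11'.
import numpy as np

n, delta, epochs = 50, 0.9, 100
A = (1 - delta) * np.eye(n) + delta * np.ones((n, n))
rng = np.random.default_rng(0)

def run(permute):
    x = np.ones(n)
    for _ in range(epochs):
        order = rng.permutation(n) if permute else np.arange(n)
        for i in order:
            x[i] -= (A[i] @ x) / A[i, i]   # exact line search along e_i
    return 0.5 * x @ A @ x

print(f"after {epochs} epochs: CCD {run(False):.3e}, RPCD {run(True):.3e}")
```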

A New Voltage Stability-Constrained Optimal Power Flow Model: Sufficient Condition, SOCP Representation, and Relaxation

A simple characterization of the solvability of power flow equations is of great importance in the monitoring, control, and protection of power systems. In this paper, we introduce a sufficient condition for power flow Jacobian nonsingularity. We show that this condition is second-order conic representable when load powers are fixed. Through the incorporation of the …
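
As a generic illustration of second-order cone representability (the paper's actual Jacobian-nonsingularity condition is not reproduced here), the sketch below embeds an SOC constraint in a hypothetical toy model, assuming the cvxpy modeling package.

```python
# cp.SOC(t, x) encodes the second-order cone constraint ||x||_2 <= t.
import cvxpy as cp

x = cp.Variable(3)
t = cp.Variable()
prob = cp.Problem(cp.Minimize(t),
                  [cp.SOC(t, x),       # second-order cone constraint
                   cp.sum(x) == 1.0])
prob.solve()
print(t.value, x.value)                # t = 1/sqrt(3), x = (1/3, 1/3, 1/3)
```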

Regularized Nonlinear Acceleration

We describe a convergence acceleration technique for generic optimization problems. Our scheme computes estimates of the optimum from a nonlinear average of the iterates produced by any optimization method. The weights in this average are computed via a simple linear system, whose solution can be updated online. This acceleration scheme runs in parallel to the …
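
A minimal sketch of such an extrapolation step, applied to plain gradient descent on a quadratic; the regularization constant lam is an illustrative choice, not the paper's tuned value.

```python
import numpy as np

def extrapolate(xs, lam=1e-8):
    X = np.array(xs)                   # iterates x_0, ..., x_k
    R = np.diff(X, axis=0)             # differences r_i = x_{i+1} - x_i
    RR = R @ R.T
    RR /= np.linalg.norm(RR)           # normalize before regularizing
    z = np.linalg.solve(RR + lam * np.eye(len(R)), np.ones(len(R)))
    c = z / z.sum()                    # weights summing to one
    return c @ X[:-1]                  # nonlinear average of the iterates

rng = np.random.default_rng(1)
Q = rng.normal(size=(20, 20))
A, b = Q.T @ Q + np.eye(20), rng.normal(size=20)
x, xs = np.zeros(20), []
for _ in range(10):                    # a few plain gradient steps
    xs.append(x.copy())
    x -= (A @ x - b) / np.linalg.norm(A, 2)
x_star = np.linalg.solve(A, b)
print(np.linalg.norm(xs[-1] - x_star), np.linalg.norm(extrapolate(xs) - x_star))
```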

Sharpness, Restart, and Acceleration

The Łojasiewicz inequality shows that sharpness bounds on the minimum of convex optimization problems hold almost generically. Here, we show that sharpness directly controls the performance of restart schemes. The constants quantifying sharpness are of course unobservable, but we show that optimal restart strategies are fairly robust, and searching for the best scheme only increases …
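
A minimal sketch of a fixed-period restart scheme wrapped around an accelerated gradient method; the small grid over restart periods mirrors the search for the best scheme.

```python
import numpy as np

def agd(grad, x0, L, iters):           # Nesterov's accelerated gradient
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = y - grad(y) / L
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

def restarted(grad, x0, L, period, total):
    x = x0.copy()
    for _ in range(total // period):   # restart: momentum is reset
        x = agd(grad, x, L, period)
    return x

rng = np.random.default_rng(2)
A = rng.normal(size=(30, 20))
grad = lambda x: A.T @ (A @ x)         # f(x) = 0.5*||Ax||^2, sharp at 0
L = np.linalg.norm(A, 2) ** 2
x0 = rng.normal(size=20)
for period in (10, 25, 50, 100):
    x = restarted(grad, x0, L, period, 200)
    print(period, 0.5 * np.linalg.norm(A @ x) ** 2)
```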

Integration Methods and Accelerated Optimization Algorithms

We show that accelerated optimization methods can be seen as particular instances of multi-step integration schemes from numerical analysis, applied to the gradient flow equation. In comparison with recent advances in this vein, the differential equation considered here is the basic gradient flow, and we show that multi-step schemes allow integration of this differential equation …
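
To make the correspondence concrete, the sketch below integrates the gradient flow x'(t) = -∇f(x(t)) with forward Euler (which is exactly gradient descent) and with a two-step Adams-Bashforth scheme, the latter a simple stand-in for the multi-step methods studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
Q = rng.normal(size=(10, 10))
A = Q.T @ Q + 0.1 * np.eye(10)
grad = lambda x: A @ x                 # f(x) = 0.5*x'Ax, minimized at 0
h = 0.5 / np.linalg.norm(A, 2)         # step size inside both stability regions

x = np.ones(10)                        # forward Euler == gradient descent
for _ in range(100):
    x = x - h * grad(x)
print("Euler:          ", 0.5 * x @ A @ x)

x_prev = np.ones(10)                   # two-step Adams-Bashforth
x = x_prev - h * grad(x_prev)          # bootstrap with one Euler step
for _ in range(99):
    x, x_prev = x - h * (1.5 * grad(x) - 0.5 * grad(x_prev)), x
print("Adams-Bashforth:", 0.5 * x @ A @ x)
```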