Exact duality in semidefinite programming based on elementary reformulations

In semidefinite programming (SDP), unlike in linear programming, Farkas’ lemma may fail to prove infeasibility. Here we obtain an exact, short certificate of infeasibility in SDP by an elementary approach: we reformulate any equality-constrained semidefinite system using only elementary row operations and rotations. When the system is infeasible, the infeasibility of the reformulated system … Read more
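
A standard small example, not taken from this abstract, shows how the plain Farkas-type alternative can fail. Writing the constraints as $A_i \bullet X = b_i$, $X \succeq 0$, consider

\[
A_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix},\quad b_1 = 0, \qquad
A_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix},\quad b_2 = 2 .
\]

The first constraint forces $X_{11}=0$, so positive semidefiniteness forces $X_{12}=X_{21}=0$, contradicting $A_2 \bullet X = 2$; the system is infeasible. Yet any multiplier $y$ with $y_1 A_1 + y_2 A_2 \preceq 0$ must have $y_2 = 0$, hence $b^{\mathsf T} y = 0$, so no dual vector certifies the infeasibility in the usual Farkas form.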

Robust newsvendor problem with autoregressive demand

This paper explores the classic single-item newsvendor problem in a novel setting that combines temporal dependence and tractable robust optimization. First, the demand is modeled as a time series that follows an autoregressive process AR($p$), $p \ge 1$. Second, a robust approach to maximize the worst-case revenue is proposed: a robust distribution-free autoregressive forecasting method, which … Read more
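
As a rough illustration of the setup only, and not of the paper's method, the sketch below simulates AR(1) demand, fits the coefficients by least squares, builds a distribution-free scenario set from the residuals, and picks the order quantity with the best worst-case profit. The price, cost, AR order, and the uncertainty-set construction are assumptions made for this example.

```python
# Illustrative sketch only: AR(1) demand fitted by least squares, with a scenario-based
# robust single-period order decision.  Price, cost, AR order, and the uncertainty-set
# construction are assumptions for this example, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) demand history: d_t = mu + phi * d_{t-1} + eps_t.
mu_true, phi_true, sigma = 20.0, 0.6, 5.0
T = 200
d = np.empty(T)
d[0] = mu_true / (1.0 - phi_true)                      # start at the stationary mean
for t in range(1, T):
    d[t] = mu_true + phi_true * d[t - 1] + rng.normal(0.0, sigma)

# Fit AR(1) coefficients by ordinary least squares on (d_{t-1}, d_t) pairs.
X = np.column_stack([np.ones(T - 1), d[:-1]])
beta, *_ = np.linalg.lstsq(X, d[1:], rcond=None)
resid = d[1:] - X @ beta
point_forecast = beta[0] + beta[1] * d[-1]

# Distribution-free scenario set for next-period demand: forecast plus empirical residuals.
scenarios = point_forecast + resid

# Robust newsvendor: choose the order q maximizing worst-case profit p*min(q, demand) - c*q.
price, cost = 10.0, 6.0
candidates = np.linspace(scenarios.min(), scenarios.max(), 200)
worst_case_profit = lambda q: float(np.min(price * np.minimum(q, scenarios) - cost * q))
q_robust = max(candidates, key=worst_case_profit)
print(f"point forecast: {point_forecast:.1f}, robust order: {q_robust:.1f}")
```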

How to Convexify the Intersection of a Second Order Cone and a Nonconvex Quadratic

A recent series of papers has examined the extension of disjunctive-programming techniques to mixed-integer second-order-cone programming. For example, it has been shown, by several authors using different techniques, that the convex hull of the intersection of an ellipsoid, $\E$, and a split disjunction, $(l - x_j)(x_j - u) \le 0$ with $l < u$, equals the intersection ... Read more
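
A one-line check of this encoding, which the abstract takes for granted: the product inequality is equivalent to the two-sided disjunction on $x_j$,

\[
(l - x_j)(x_j - u) \le 0 \;\Longleftrightarrow\; x_j \le l \ \text{or}\ x_j \ge u ,
\]

since for $l < x_j < u$ both factors are strictly negative and the product is positive, while at or beyond either bound one factor is nonnegative and the other nonpositive.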

Gradient Sliding for Composite Optimization

We consider in this paper a class of composite optimization problems whose objective function is given by the sum of a smooth component and a general nonsmooth component, together with a relatively simple nonsmooth term. We present a new class of first-order methods, namely the gradient sliding algorithms, which can skip the computation of the gradient for … Read more
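
The following is a schematic sketch of the sliding idea only, with made-up step sizes and a toy problem; it is not Lan's gradient sliding algorithm or its step-size policy. The point it illustrates is that the expensive gradient of the smooth part is evaluated once per outer iteration and reused across several cheap inner subgradient steps on the nonsmooth part.

```python
# Schematic sketch of the "sliding" idea only (not the paper's algorithm or step sizes):
# the gradient of the smooth part f is computed once per outer iteration and reused
# across several cheap inner steps that only touch the nonsmooth part h.
import numpy as np

def sliding_sketch(grad_f, subgrad_h, x0, beta, gamma, outer_iters=100, inner_iters=10):
    """Toy scheme: each inner loop takes subgradient steps on the model
       <grad_f(x), u> + h(u) + (beta/2)*||u - x||^2  while grad_f(x) stays fixed."""
    x = x0.copy()
    for _ in range(outer_iters):
        g = grad_f(x)                          # the only (expensive) gradient evaluation
        u = x.copy()
        for _ in range(inner_iters):           # cheap steps: no new gradients of f
            u = u - gamma * (g + subgrad_h(u) + beta * (u - x))
        x = u
    return x

# Toy instance (assumed, for illustration): f(x) = 0.5*||A x - b||^2, h(x) = 0.1*||x||_1.
rng = np.random.default_rng(1)
A, b = rng.normal(size=(40, 10)), rng.normal(size=40)
L = np.linalg.norm(A, ord=2) ** 2              # Lipschitz constant of grad f
x_hat = sliding_sketch(grad_f=lambda x: A.T @ (A @ x - b),
                       subgrad_h=lambda x: 0.1 * np.sign(x),
                       x0=np.zeros(10), beta=L, gamma=1.0 / (2 * L))
print(np.round(x_hat, 3))
```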

An inertial Tseng’s type proximal algorithm for nonsmooth and nonconvex optimization problems

We investigate the convergence of a forward-backward-forward proximal-type algorithm with inertial and memory effects when minimizing the sum of a nonsmooth function and a smooth one in the absence of convexity. The convergence is obtained provided an appropriate regularization of the objective satisfies the Kurdyka-\L{}ojasiewicz inequality, which is fulfilled, for instance, by semi-algebraic functions. Article … Read more
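
For orientation, here is a schematic sketch of an inertial Tseng-type (forward-backward-forward) iteration on a toy nonconvex problem. The inertia and step-size values are illustrative placeholders and are not the parameter conditions analyzed in the paper.

```python
# Schematic inertial forward-backward-forward (Tseng-type) iteration, written as a toy
# sketch: the inertia and step-size values below are illustrative placeholders, not the
# parameter rules from the paper.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_fbf(grad_g, prox_f, x0, step=0.02, inertia=0.2, iters=500):
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        w = x + inertia * (x - x_prev)               # inertial extrapolation
        y = prox_f(w - step * grad_g(w), step)       # forward (gradient) + backward (prox)
        x_new = y - step * (grad_g(y) - grad_g(w))   # second forward step (Tseng correction)
        x_prev, x = x, x_new
    return x

# Toy nonconvex instance (assumed): smooth g(x) = sum((x_i^2 - 1)^2), nonsmooth 0.1*||x||_1.
grad_g = lambda x: 4.0 * x * (x ** 2 - 1.0)
prox_f = lambda v, t: soft_threshold(v, 0.1 * t)     # prox of t * 0.1*||.||_1
print(inertial_fbf(grad_g, prox_f, np.array([2.0, -1.5, 0.3])))
```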

An Efficient Gauss-Newton Algorithm for Symmetric Low-Rank Product Matrix Approximations

We derive and study a Gauss-Newton method for computing a symmetric low-rank product that is closest to a given symmetric matrix in the Frobenius norm. Our Gauss-Newton method, which has a particularly simple form, shares the same order of iteration-complexity as a gradient method when the size of the desired eigenspace is small, but can be … Read more
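
As an illustration of the underlying problem, minimizing $\|YY^{\mathsf T} - A\|_F$ over $n \times k$ factors $Y$, the sketch below takes damped Gauss-Newton steps using one particular solution of the linearized normal equations. The test matrices, the crude damping rule, and the choice among the non-unique solutions are assumptions for this example and need not match the paper's method.

```python
# Sketch of a Gauss-Newton iteration for  min_Y ||Y Y^T - A||_F  over n-by-k factors Y.
# The direction below is one particular solution of the linearized normal equations
#   S Y^T Y + Y S^T Y = (A - Y Y^T) Y;  the paper's exact variant and step-size rules
#   may differ from this illustration.
import numpy as np

def gauss_newton_direction(A, Y):
    R = A - Y @ Y.T                           # current residual
    G = np.linalg.inv(Y.T @ Y)                # (Y^T Y)^{-1}; Y assumed full column rank
    RYG = R @ Y @ G
    return RYG - 0.5 * Y @ G @ (Y.T @ RYG)

rng = np.random.default_rng(0)
n, k = 30, 3
M = rng.normal(size=(n, k + 2))
A = M @ M.T                                   # symmetric target matrix
Y = rng.normal(size=(n, k))                   # rank-k factor to optimize

for _ in range(50):
    S = gauss_newton_direction(A, Y)
    t, base = 1.0, np.linalg.norm(Y @ Y.T - A)
    while np.linalg.norm((Y + t * S) @ (Y + t * S).T - A) > base and t > 1e-8:
        t *= 0.5                              # crude damping, only for this illustration
    Y = Y + t * S
print("final ||YY^T - A||_F =", round(float(np.linalg.norm(Y @ Y.T - A)), 4))
```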