Stochastic model-based minimization under high-order growth

Given a nonsmooth, nonconvex minimization problem, we consider algorithms that iteratively sample and minimize stochastic convex models of the objective function. Assuming that the one-sided approximation quality and the variation of the models are controlled by a Bregman divergence, we show that the scheme drives a natural stationarity measure to zero at the rate $O(k^{-1/4})$. … Read more
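For intuition, here is a minimal sketch of one iteration of such a scheme, assuming the simplest admissible convex model (the sampled linearization) and the Euclidean Bregman divergence $D(y,x) = \tfrac12\|y-x\|^2$; with these choices the model-minimization step reduces to a stochastic subgradient update on a toy robust-regression objective. The data, the helper name sample_model_step, and the step sizes are illustrative, not taken from the paper.

    import numpy as np

    # Sketch: stochastic model-based step with the linearization model and
    # Euclidean Bregman divergence; the step then reduces to a stochastic
    # subgradient update.  All data below are synthetic and illustrative.
    rng = np.random.default_rng(0)
    n, N = 20, 1000
    A = rng.standard_normal((N, n))
    x_true = rng.standard_normal(n)
    b = A @ x_true                      # toy robust-regression data

    def sample_model_step(x, beta):
        i = rng.integers(N)             # sample one data point (one "xi")
        r = A[i] @ x - b[i]
        g = np.sign(r) * A[i]           # subgradient of the sampled loss |a^T x - b|
        # argmin_y [ |r| + g^T (y - x) ] + (1 / (2 * beta)) * ||y - x||^2
        return x - beta * g

    x = np.zeros(n)
    for k in range(1, 5001):
        x = sample_model_step(x, beta=1.0 / np.sqrt(k))
    print("final mean absolute residual:", np.mean(np.abs(A @ x - b)))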

An inexact strategy for the projected gradient algorithm in vector optimization problems on variable ordered spaces

Variable order structures model situations in which the comparison between two points depends on a point-to-cone map. In this paper, an inexact projected gradient method for solving smooth constrained vector optimization problems on variable ordered spaces is presented. It is shown that every accumulation point of the generated sequence satisfies the first order necessary optimality … Read more
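As a point of reference, a commonly used first-order necessary condition in this setting (stated here in generic notation, not necessarily that of the paper) is that a feasible point $x^*$ is stationary when
\[
JF(x^*)\,(x - x^*) \notin -\operatorname{int} K(x^*) \qquad \text{for all } x \in C,
\]
where $F$ is the vector-valued objective, $C$ the feasible set, $JF$ the Jacobian of $F$, and $K(\cdot)$ the point-to-cone map defining the variable order.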

The automorphism group and the non-self-duality of p-cones

In this paper, we determine the automorphism group of the $p$-cones ($p \neq 2$) in dimension greater than two. In particular, we show that the automorphism group of those $p$-cones consists of the positive scalar multiples of the generalized permutation matrices that fix the main axis of the cone. Next, we take a look at a problem … Read more
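For context, a standard way to write the $p$-cone (the notation here is illustrative) is
\[
\mathcal{K}_p^{\,n} \;=\; \bigl\{ (t, \bar x) \in \mathbb{R} \times \mathbb{R}^{\,n-1} \;:\; t \ge \|\bar x\|_p \bigr\}, \qquad 1 \le p \le \infty .
\]
An automorphism of the type described above acts, for instance, as $(t, \bar x) \mapsto (\alpha t, \alpha P \bar x)$ with $\alpha > 0$ and $P$ a signed permutation of the $\bar x$-block: such a map preserves $\|\cdot\|_p$ on that block and fixes the main axis $\{(t, 0) : t \ge 0\}$, hence maps the cone onto itself.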

On positive duality gaps in semidefinite programming

We study semidefinite programs (SDPs) with positive duality gaps, i.e., those in which the optimal values of the primal and dual problems differ. These SDPs are considered extremely pathological; they are often unsolvable, and they also serve as models of more general pathological convex programs. We first fully characterize two-variable SDPs with … Read more
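As an illustration of the phenomenon, here is a classical two-variable example (variants appear, e.g., in the Vandenberghe–Boyd SDP survey; it is not necessarily one of the instances characterized in the paper):
\[
p^* \;=\; \inf_{x_1, x_2} \; x_1
\quad\text{s.t.}\quad
\begin{pmatrix} 0 & x_1 & 0 \\ x_1 & x_2 & 0 \\ 0 & 0 & x_1 + 1 \end{pmatrix} \succeq 0 .
\]
Feasibility forces $x_1 = 0$, so $p^* = 0$, while the dual optimal value is $d^* = -1$: a duality gap of $1$ even though both problems attain their optima.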

On the Complexity of Detecting Convexity over a Box

It has recently been shown that the problem of testing global convexity of polynomials of degree four is strongly NP-hard, answering an open question of N.Z. Shor. This result is minimal in the degree of the polynomial when global convexity is of concern. In a number of applications, however, one is interested in testing convexity … Read more
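To make the question concrete, the sketch below samples the Hessian of a quartic over a box and checks positive semidefiniteness pointwise; a negative eigenvalue at any sampled point certifies non-convexity there. This is only an illustrative heuristic necessary check (the polynomial and box are chosen arbitrarily), not the decision procedure or hardness construction studied in the paper.

    import numpy as np

    def hessian(x, y):
        # Hessian of the quartic f(x, y) = x**4 + x**2 * y**2 - y**4 (illustrative choice)
        return np.array([[12 * x**2 + 2 * y**2, 4 * x * y],
                         [4 * x * y,            2 * x**2 - 12 * y**2]])

    grid = np.linspace(-1.0, 1.0, 21)          # sample points of the box [-1, 1]^2
    min_eig = min(np.linalg.eigvalsh(hessian(x, y)).min()
                  for x in grid for y in grid)
    print("smallest sampled Hessian eigenvalue:", min_eig)
    print("convex at all sampled points" if min_eig >= 0 else "not convex over the box")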

Convergence Rates for Projective Splitting

Projective splitting is a family of methods for solving inclusions involving sums of maximal monotone operators. First introduced by Eckstein and Svaiter in 2008, these methods have enjoyed significant innovation in recent years, becoming one of the most flexible operator splitting frameworks available. While weak convergence of the iterates to a solution has been established, … Read more
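For orientation, a sketch of the underlying geometric idea in the two-operator case (notation generic; the paper's rate analysis covers more general variants): to solve $0 \in A(z) + B(z)$ one seeks a pair $(z^*, w^*)$ with $w^* \in A(z^*)$ and $-w^* \in B(z^*)$. Given points $(x, a) \in \operatorname{gra} A$ and $(y, b) \in \operatorname{gra} B$ produced by resolvent (proximal) steps, the affine function
\[
\varphi(z, w) \;=\; \langle z - x,\; a - w \rangle \;+\; \langle z - y,\; b + w \rangle
\]
is nonpositive at every such solution pair by monotonicity, so the next iterate is obtained by projecting the current pair $(z^k, w^k)$ onto the halfspace $\{\varphi \le 0\}$; hence the name projective splitting.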

A sparse optimization approach for energy-efficient timetabling in metro railway systems

In this paper we propose a sparse optimization approach to maximize the utilization of regenerative energy produced by braking trains for energy-efficient timetabling in metro railway systems. By introducing the cardinality function and the square of the Euclidean norm as the objective function, the resulting sparse optimization model can characterize the utilization of the … Read more
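Schematically, and with purely illustrative notation, such a model combines a cardinality term with a squared Euclidean norm,
\[
\min_{t \in \mathcal{T}} \;\; \|x(t)\|_0 \;+\; \lambda\, \|y(t)\|_2^2 ,
\]
where $t$ collects the timetable variables, $\mathcal{T}$ encodes the operational constraints, and the two terms quantify how much of the regenerative energy goes unused; the precise construction of each term is given in the paper.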

Efficient Optimization Algorithms for Robust Principal Component Analysis and Its Variants

Robust PCA has drawn significant attention in the last decade due to its success in numerous application domains, ranging from bio-informatics, statistics, and machine learning to image and video processing in computer vision. Robust PCA and its variants such as sparse PCA and stable PCA can be formulated as optimization problems with exploitable special structures. … Read more
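For reference, the convex formulation usually associated with Robust PCA (stated here in its standard principal component pursuit form; the paper treats this and related variants) is
\[
\min_{L, S} \;\; \|L\|_* + \lambda \|S\|_1 \quad \text{s.t.} \quad L + S = M ,
\]
where $M$ is the observed data matrix, the nuclear norm $\|\cdot\|_*$ promotes a low-rank component $L$, and the entrywise $\ell_1$ norm promotes a sparse component $S$.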

A lower bound on the iterative complexity of the Harker and Pang globalization technique of the Newton-min algorithm for solving the linear complementarity problem

The plain Newton-min algorithm for solving the linear complementarity problem (LCP) $0 \le x \perp (Mx+q) \ge 0$ can be viewed as an instance of the plain semismooth Newton method on the equational version $\min(x, Mx+q) = 0$ of the problem. This algorithm converges for any $q$ when $M$ is an M-matrix, but not when it … Read more
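A minimal sketch of the plain Newton-min iteration (without the Harker and Pang globalization analyzed in the paper): at each step, the components where $x \le Mx+q$ are set to zero and the remaining components solve the corresponding linear equations. The function name newton_min and the data $M$, $q$ below are illustrative; $M$ is chosen to be an M-matrix so the iteration converges.

    import numpy as np

    def newton_min(M, q, x0, max_iter=50, tol=1e-10):
        # Semismooth Newton on min(x, Mx + q) = 0: pick the active piece
        # componentwise at the current iterate, then solve the resulting
        # linear system for the next iterate.
        x = x0.copy()
        for _ in range(max_iter):
            w = M @ x + q
            I = x <= w                    # components handled by the x-piece -> set to 0
            J = ~I                        # components handled by the (Mx+q)-piece -> solve
            x_new = np.zeros_like(x)
            if J.any():
                x_new[J] = np.linalg.solve(M[np.ix_(J, J)], -q[J])
            if np.linalg.norm(np.minimum(x_new, M @ x_new + q)) <= tol:
                return x_new
            x = x_new
        return x

    M = np.array([[2.0, -1.0, 0.0],
                  [-1.0, 2.0, -1.0],
                  [0.0, -1.0, 2.0]])      # a tridiagonal M-matrix
    q = np.array([1.0, -2.0, 1.0])
    x = newton_min(M, q, np.zeros(3))
    print("x =", x, " Mx+q =", M @ x + q)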

The Proximal Alternating Minimization Algorithm for two-block separable convex optimization problems with linear constraints

The Alternating Minimization Algorithm (AMA) was proposed by Tseng for solving convex programming problems with two-block separable linear constraints and objectives, whereby (at least) one of the components of the latter is assumed to be strongly convex. The fact that one of the subproblems to be solved within the iteration process of AMA does … Read more
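For reference, Tseng's AMA applied to $\min_{x,z} f(x) + g(z)$ subject to $Ax + Bz = b$, with $f$ strongly convex (notation generic), iterates
\[
\begin{aligned}
x^{k+1} &\in \operatorname*{arg\,min}_{x} \; f(x) + \langle \lambda^k, Ax \rangle, \\
z^{k+1} &\in \operatorname*{arg\,min}_{z} \; g(z) + \langle \lambda^k, Bz \rangle + \tfrac{\rho}{2}\,\| Ax^{k+1} + Bz - b \|^2, \\
\lambda^{k+1} &= \lambda^k + \rho \bigl( Ax^{k+1} + Bz^{k+1} - b \bigr),
\end{aligned}
\]
i.e., the ADMM-type scheme with the quadratic penalty dropped from the first subproblem, which is where the strong convexity of $f$ is used.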