Robust stochastic optimization with the proximal point method

Standard results in stochastic convex optimization bound the number of samples that an algorithm needs to generate a point with small function value in expectation. In this work, we show that a wide class of such algorithms on strongly convex problems can be augmented with sub-exponential confidence bounds at an overhead cost that is only …
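
For orientation, the proximal point step named in the title is the classical update

    x_{k+1} = \operatorname*{argmin}_{x} \Big\{ f(x) + \tfrac{\lambda}{2}\,\|x - x_k\|^2 \Big\}, \qquad \lambda > 0,

whose subproblem is \lambda-strongly convex even when f is merely convex. This is the standard form of the step, quoted here as a reference point; the paper's exact overhead estimate sits behind the truncation.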

On Inexact Solution of Auxiliary Problems in Tensor Methods for Convex Optimization

In this paper we study the auxiliary problems that appear in p-order tensor methods for unconstrained minimization of convex functions with \nu-Hölder continuous pth derivatives. This type of auxiliary problem corresponds to the minimization of a (p+\nu)-order regularization of the pth order Taylor approximation of the objective. For the case p=3, we consider the use …
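
Concretely, with D^i f(x)[y-x]^i denoting the i-th derivative tensor of f at x applied along y-x, and H > 0 a regularization constant (one common normalization; the paper's exact constant is hidden by the truncation), the auxiliary problem reads

    \min_{y \in \mathbb{R}^n} \; f(x) + \sum_{i=1}^{p} \frac{1}{i!}\, D^i f(x)[y-x]^i + \frac{H}{p+\nu}\,\|y-x\|^{p+\nu}.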

Dual sufficient characterizations of transversality properties

This paper continues the study of ‘good arrangements’ of collections of sets near a point in their intersection. Our aim is to develop a general scheme for quantitative analysis of several transversality properties within the same framework. We consider a general nonlinear setting and establish dual space (subdifferential and normal cone) sufficient characterizations of transversality …
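
For orientation (our summary, not a quotation from the truncated abstract): in the finite-dimensional setting, the best-known dual criterion of this kind, for closed sets A and B and a point \bar{x} \in A \cap B, is the normal-cone condition

    N_A(\bar{x}) \cap \big(-N_B(\bar{x})\big) = \{0\},

with N denoting the limiting normal cone; the characterizations developed here generalize this pattern to the nonlinear setting.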

A simple Newton method for local nonsmooth optimization

Superlinear convergence has been an elusive goal for black-box nonsmooth optimization. Even in the convex case, the subgradient method is very slow, and while some cutting plane algorithms, including traditional bundle methods, are popular in practice, local convergence is still sluggish. Faster variants depend either on problem structure or on analyses that elide sequences of …
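
To quantify 'very slow': for a convex function with bounded subgradients, the subgradient method with well-chosen step sizes guarantees only

    \min_{i \le k} f(x_i) - f^\star = O(1/\sqrt{k}),

a sublinear rate, which is the baseline against which the superlinear local convergence sought here should be read.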

Relations Between Abs-Normal NLPs and MPCCs Part 1: Strong Constraint Qualifications

This work is part of an ongoing effort of comparing non-smooth optimization problems in abs-normal form to MPCCs. We study the general abs-normal NLP with equality and inequality constraints in relation to an equivalent MPCC reformulation. We show that kink qualifications and MPCC constraint qualifications of linear independence type and Mangasarian-Fromovitz type are equivalent. Then …
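
For reference, an MPCC (mathematical program with complementarity constraints) has the generic form

    \min_x f(x) \quad \text{s.t.} \quad g(x) \le 0, \;\; h(x) = 0, \;\; 0 \le G(x) \perp H(x) \ge 0,

where \perp requires G_i(x)\,H_i(x) = 0 componentwise. This generic shape (not the paper's specific reformulation, which the truncation hides) is the class against which the abs-normal kink qualifications are compared.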

Stochastic algorithms with geometric step decay converge linearly on sharp functions

Stochastic (sub)gradient methods require careful tuning of the step size schedule to perform well in practice. Classical tuning strategies decay the step size polynomially and lead to optimal sublinear rates on (strongly) convex problems. An alternative schedule, popular in nonconvex optimization, is called geometric step decay and proceeds by halving the step size after every few epochs. In …
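
A minimal sketch of the schedule described above (function and parameter names are illustrative, not taken from the paper):

    import numpy as np

    def sgd_geometric_decay(stoch_grad, x0, step0=1.0, decay=0.5,
                            epoch_len=100, n_epochs=20, seed=0):
        # Run stochastic (sub)gradient steps with a constant step size within
        # each epoch, then cut the step geometrically (here: halve it) between
        # epochs. Names and defaults are illustrative, not from the paper.
        rng = np.random.default_rng(seed)
        x, step = np.asarray(x0, dtype=float), step0
        for _ in range(n_epochs):
            for _ in range(epoch_len):
                x = x - step * stoch_grad(x, rng)  # noisy (sub)gradient oracle
            step *= decay                          # geometric step decay
        return x

    # Example on a sharp function f(x) = ||x||_1 with additive gradient noise.
    g = lambda x, rng: np.sign(x) + 0.1 * rng.standard_normal(x.shape)
    x_final = sgd_geometric_decay(g, x0=np.ones(5))

On sharp functions of this kind, the abstract's claim is that such a schedule converges linearly, i.e. the error contracts by a constant factor every few epochs.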

Distributionally robust chance constrained geometric optimization

This paper discusses distributionally robust geometric programs with individual and joint chance constraints. Seven groups of uncertainty sets are considered: uncertainty sets with information on the first two moments, uncertainty sets constrained by the Kullback-Leibler divergence with a normal reference distribution or a discrete reference distribution, uncertainty sets with known first moments or known first …
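
Schematically, the individual distributionally robust chance constraints considered here take the form

    \inf_{F \in \mathcal{D}} \; \Pr_{\xi \sim F}\big( g(x, \xi) \le 1 \big) \ge 1 - \epsilon,

where g(x, \xi) is a posynomial constraint of the geometric program with uncertain coefficients \xi, \mathcal{D} is one of the uncertainty sets above, and \epsilon \in (0,1) is the risk level; joint versions require all uncertain constraints to hold simultaneously with probability at least 1-\epsilon. (A generic template, not the paper's exact notation.)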

Characterizations of explicitly quasiconvex vector functions w.r.t. polyhedral cones

The aim of this paper is to present new characterizations of explicitly cone-quasiconvex vector functions with respect to a polyhedral cone of a finite-dimensional Euclidean space. These characterizations are given in terms of classical explicit quasiconvexity of certain real-valued functions, defined by composing the vector-valued function with appropriate scalarization functions, namely the extreme directions of …
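
Filling in the pattern only loosely (the truncation hides the precise statement): with K the polyhedral cone, the characterizations are of the shape

    f \ \text{explicitly } K\text{-quasiconvex} \iff \langle w, f(\cdot) \rangle \ \text{explicitly quasiconvex for each scalarizing direction } w,

where the finitely many directions w are those named in the abstract; polyhedrality of K is what makes finitely many scalarizations suffice.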

Tensor Methods for Finding Approximate Stationary Points of Convex Functions

In this paper we consider the problem of finding \epsilon-approximate stationary points of convex functions that are p-times differentiable with \nu-Hölder continuous pth derivatives. We present tensor methods with and without acceleration. Specifically, we show that the non-accelerated schemes take at most O(\epsilon^{-1/(p+\nu-1)}) iterations to reduce the norm of the gradient of the objective below …
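
A sanity check on the exponent: for p = 1 and \nu = 1 (convex functions with Lipschitz-continuous gradients), the bound O(\epsilon^{-1/(p+\nu-1)}) becomes O(\epsilon^{-1}), matching the known O(1/k) rate at which gradient descent drives the gradient norm of a smooth convex function to zero.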

A family of multi-parameterized proximal point algorithms

In this paper, a multi-parameterized proximal point algorithm combined with a relaxation step is developed for solving convex minimization problems subject to linear constraints. We show its global convergence and sublinear convergence rate from the perspective of variational inequalities. Preliminary numerical experiments on a sparse minimization problem from signal processing indicate that the proposed …
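
A generic template for such a relaxed proximal step (the metric M and relaxation factor \gamma below stand in for the paper's parameters, whose exact form the truncation hides):

    \tilde{z}^k = \operatorname*{argmin}_z \Big\{ \mathcal{L}(z) + \tfrac{1}{2}\,\|z - z^k\|_M^2 \Big\}, \qquad z^{k+1} = z^k + \gamma\,(\tilde{z}^k - z^k), \quad \gamma \in (0, 2),

where \mathcal{L} is a Lagrangian-type function encoding the linear constraints; convergence analyses of this type typically proceed by casting the optimality conditions as a monotone variational inequality.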