Stochastic algorithms with geometric step decay converge linearly on sharp functions

Stochastic (sub)gradient methods require step size schedule tuning to perform well in practice. Classical tuning strategies decay the step size polynomially and lead to optimal sublinear rates on (strongly) convex problems. An alternative schedule, popular in nonconvex optimization, is called geometric step decay and proceeds by halving the step size after every few epochs. In … Read more
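The "halving the step size after every few epochs" schedule described above can be illustrated with a short sketch. The function name and parameters below are ours for illustration, not the paper's notation.

```python
def geometric_step_decay(alpha0, epochs_per_stage, num_epochs):
    """Geometric step decay: halve the step size after every
    `epochs_per_stage` epochs, starting from alpha0.

    A minimal illustration of the schedule described above; the
    names and parameterization are assumptions, not the paper's.
    """
    return [alpha0 * 0.5 ** (epoch // epochs_per_stage)
            for epoch in range(num_epochs)]

# e.g. start at 1.0 and halve every 3 epochs:
schedule = geometric_step_decay(1.0, 3, 9)
# epochs 0-2 use 1.0, epochs 3-5 use 0.5, epochs 6-8 use 0.25
```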

Distributionally robust chance constrained geometric optimization

This paper discusses distributionally robust geometric programs with individual and joint chance constraints. Seven groups of uncertainty sets are considered: uncertainty sets with information on the first two moments, uncertainty sets constrained by the Kullback-Leibler divergence from a normal reference distribution or a discrete reference distribution, uncertainty sets with known first moments or known first … Read more

Characterizations of explicitly quasiconvex vector functions w.r.t. polyhedral cones

The aim of this paper is to present new characterizations of explicitly cone-quasiconvex vector functions with respect to a polyhedral cone of a finite-dimensional Euclidean space. These characterizations are given in terms of classical explicit quasiconvexity of certain real-valued functions, defined by composing the vector-valued function with appropriate scalarization functions, namely the extreme directions of … Read more

Tensor Methods for Finding Approximate Stationary Points of Convex Functions

In this paper we consider the problem of finding \epsilon-approximate stationary points of convex functions that are p-times differentiable with \nu-Hölder continuous p-th derivatives. We present tensor methods with and without acceleration. Specifically, we show that the non-accelerated schemes take at most O(\epsilon^{-1/(p+\nu-1)}) iterations to reduce the norm of the gradient of the objective below … Read more
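As a concrete reading of the stated bound, instantiating the non-accelerated rate at the familiar case p = 2, \nu = 1 (twice differentiable with Lipschitz continuous Hessian) gives:

```latex
% Non-accelerated bound from the abstract, at p = 2, \nu = 1:
O\!\left(\epsilon^{-1/(p+\nu-1)}\right)
  = O\!\left(\epsilon^{-1/(2+1-1)}\right)
  = O\!\left(\epsilon^{-1/2}\right)
```

This instantiation is our arithmetic on the bound quoted in the abstract, not an additional claim from the paper.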

A family of multi-parameterized proximal point algorithms

In this paper, a multi-parameterized proximal point algorithm combined with a relaxation step is developed for solving convex minimization problems subject to linear constraints. We show its global convergence and sublinear convergence rate from the perspective of variational inequality. Preliminary numerical experiments on a sparse minimization problem from signal processing indicate that the proposed … Read more
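The building blocks named in the abstract (a proximal step followed by a relaxation step) can be sketched on a toy unconstrained problem. This is a textbook relaxed proximal point iteration for minimizing |x|, not the paper's multi-parameterized, linearly constrained scheme.

```python
def prox_abs(x, lam):
    """Proximal operator of lam * |x| (soft thresholding)."""
    return max(abs(x) - lam, 0.0) * (1.0 if x > 0 else -1.0)

def relaxed_proximal_point(x0, lam=1.0, rho=1.5, iters=50):
    """Generic relaxed proximal point iteration for min |x|.

    A hedged sketch only: the paper's algorithm carries extra
    parameters and handles linear constraints, which this toy
    example does not reproduce.
    """
    x = x0
    for _ in range(iters):
        half = prox_abs(x, lam)      # proximal step
        x = x + rho * (half - x)     # relaxation step, rho in (0, 2)
    return x

relaxed_proximal_point(10.0)  # converges to 0, the minimizer of |x|
```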

The stochastic multi-gradient algorithm for multi-objective optimization and its application to supervised machine learning

Optimization of conflicting functions is of paramount importance in decision making, and real-world applications frequently involve data that are uncertain or unknown, resulting in multi-objective optimization (MOO) problems of stochastic type. We study the stochastic multi-gradient (SMG) method, seen as an extension of the classical stochastic gradient method for single-objective optimization. At each iteration … Read more
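The deterministic multi-gradient idea that SMG builds on combines the objectives' gradients into a single common descent direction, typically the minimum-norm convex combination. Below is a hedged two-objective sketch of that combination step (MGDA-style); it is background, not the authors' stochastic SMG method, and all names are ours.

```python
import numpy as np

def common_descent_direction(g1, g2):
    """Minimum-norm convex combination of two gradients.

    Solves min_{lam in [0,1]} ||lam*g1 + (1-lam)*g2||^2 in closed
    form. Background sketch of the multi-gradient combination step,
    not the paper's stochastic algorithm.
    """
    diff = g1 - g2
    denom = diff @ diff
    lam = 0.5 if denom == 0 else float(np.clip(-(diff @ g2) / denom, 0.0, 1.0))
    return lam * g1 + (1.0 - lam) * g2

g1 = np.array([1.0, 0.0])
g2 = np.array([0.0, 1.0])
common_descent_direction(g1, g2)   # -> array([0.5, 0.5])
```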

Geometric and Metric Characterizations of Transversality Properties

This paper continues the study of ‘good arrangements’ of collections of sets near a point in their intersection. Our aim is to clarify the relations between various quantitative geometric and metric characterizations of the transversality properties of collections of sets and the corresponding regularity properties of set-valued mappings. We expose all the parameters involved in … Read more

Nonlinear Transversality Properties of Collections of Sets: Dual Space Necessary Characterizations

This paper continues the study of ‘good arrangements’ of collections of sets in normed vector spaces near a point in their intersection. Our aim is to study general nonlinear transversality properties. We focus on dual space (subdifferential and normal cone) necessary characterizations of these properties. As an application, we provide dual necessary and sufficient conditions … Read more

Optimal K-Thresholding Algorithms for Sparse Optimization Problems

The simulations indicate that the existing hard thresholding technique, being independent of the residual function, may cause a dramatic increase or numerical oscillation of the residual. This inherent drawback of hard thresholding renders the traditional thresholding algorithms unstable and thus generally inefficient for solving practical sparse optimization problems. How to overcome this weakness and develop … Read more
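For reference, the standard hard thresholding operator the abstract criticizes simply keeps the k largest-magnitude entries and zeroes the rest, with no regard for the residual. This sketch shows that background operator, not the paper's proposed optimal k-thresholding method.

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest.

    The classical residual-independent hard thresholding operator;
    shown for context only, not the paper's algorithm.
    """
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]   # indices of the k largest magnitudes
    out[idx] = x[idx]
    return out

x = np.array([0.1, -3.0, 0.5, 2.0])
hard_threshold(x, 2)   # -> array([ 0., -3.,  0.,  2.])
```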

Single-Forward-Step Projective Splitting: Exploiting Cocoercivity

This work describes a new variant of projective splitting for monotone inclusions, in which cocoercive operators can be processed with a single forward step per iteration. This result establishes a symmetry between projective splitting algorithms, the classical forward-backward splitting method (FB), and Tseng's forward-backward-forward method (FBF). Another symmetry is that the new procedure allows … Read more
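The classical forward-backward method mentioned above alternates one gradient (forward) step on the smooth, cocoercive part with one proximal (backward) step on the nonsmooth part. Below is a textbook 1-D sketch of FB on min 0.5*(x-b)^2 + lam*|x|; it illustrates the comparison baseline, not the paper's projective splitting variant, and the parameter values are ours.

```python
def forward_backward(b=3.0, lam=1.0, gamma=0.5, iters=100):
    """Classical forward-backward splitting for min 0.5*(x-b)^2 + lam*|x|.

    The gradient of the smooth term is 1-cocoercive here, so one
    forward step plus one prox (backward) step per iteration suffices.
    Background sketch only.
    """
    x = 0.0
    for _ in range(iters):
        y = x - gamma * (x - b)                                # forward step
        x = max(abs(y) - gamma * lam, 0.0) * (1.0 if y > 0 else -1.0)  # prox step
    return x

forward_backward()   # converges to b - lam = 2.0 (soft-thresholded solution)
```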