The stochastic multi-gradient algorithm for multi-objective optimization and its application to supervised machine learning

Optimization of conflicting functions is of paramount importance in decision making, and real-world applications frequently involve data that is uncertain or unknown, resulting in multi-objective optimization (MOO) problems of stochastic type. We study the stochastic multi-gradient (SMG) method, seen as an extension of the classical stochastic gradient method for single-objective optimization. At each iteration … Read more
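
As a rough illustration of the idea behind a stochastic multi-gradient step, the sketch below handles the two-objective case, where the common descent direction is the minimum-norm convex combination of two stochastic gradients and has a closed form. The gradient oracles `grad_f1`/`grad_f2` are placeholders; the paper's algorithm and analysis are more general than this sketch.

```python
# Minimal sketch of one SMG-style step for two objectives, assuming hypothetical
# stochastic gradient oracles grad_f1(x, rng) and grad_f2(x, rng).
import numpy as np

def smg_step(x, grad_f1, grad_f2, step_size=1e-2, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    g1 = grad_f1(x, rng)          # stochastic gradient of objective 1
    g2 = grad_f2(x, rng)          # stochastic gradient of objective 2
    diff = g1 - g2
    denom = diff @ diff
    # Coefficient of the minimum-norm point on the segment [g1, g2].
    lam = 0.5 if denom == 0 else np.clip((g2 - g1) @ g2 / denom, 0.0, 1.0)
    d = lam * g1 + (1.0 - lam) * g2   # common (stochastic) descent direction
    return x - step_size * d
```

With more than two objectives, the combination weights come from a small quadratic program over the simplex rather than this closed form.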

Geometric and Metric Characterizations of Transversality Properties

This paper continues the study of ‘good arrangements’ of collections of sets near a point in their intersection. Our aim is to clarify the relations between various quantitative geometric and metric characterizations of the transversality properties of collections of sets and the corresponding regularity properties of set-valued mappings. We expose all the parameters involved in … Read more

Nonlinear Transversality Properties of Collections of Sets: Dual Space Necessary Characterizations

This paper continues the study of ‘good arrangements’ of collections of sets in normed vector spaces near a point in their intersection. Our aim is to study general nonlinear transversality properties. We focus on dual space (subdifferential and normal cone) necessary characterizations of these properties. As an application, we provide dual necessary and sufficient conditions … Read more

Optimal K-Thresholding Algorithms for Sparse Optimization Problems

The simulations indicate that the existing hard thresholding technique, being independent of the residual function, may cause a dramatic increase or numerical oscillation of the residual. This inherent drawback of hard thresholding renders the traditional thresholding algorithms unstable and thus generally inefficient for solving practical sparse optimization problems. How to overcome this weakness and develop … Read more
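
For context, the sketch below shows the classical hard thresholding operator and one iterative hard thresholding (IHT) step for min ||y - Ax||^2 subject to ||x||_0 <= k. This is the residual-independent baseline the abstract refers to, not the optimal k-thresholding scheme developed in the paper.

```python
# Classical hard thresholding H_k and one IHT step (baseline, not the paper's method).
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def iht_step(x, A, y, k, step_size=1.0):
    grad = A.T @ (A @ x - y)      # gradient of the residual 1/2 ||y - Ax||^2
    return hard_threshold(x - step_size * grad, k)
```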

Single-Forward-Step Projective Splitting: Exploiting Cocoercivity

This work describes a new variant of projective splitting for monotone inclusions, in which cocoercive operators can be processed with a single forward step per iteration. This result establishes a symmetry between projective splitting algorithms, the classical forward-backward splitting method (FB), and Tseng’s forward-backward-forward method (FBF). Another symmetry is that the new procedure allows … Read more
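
For reference, the sketch below is classical forward-backward splitting in its proximal-gradient form for min f(x) + g(x) with f smooth and g prox-friendly (here g = ||.||_1, whose prox is soft thresholding). It illustrates the single forward step per iteration mentioned above, not the projective splitting variant proposed in the paper; the LASSO usage is a placeholder example.

```python
# Classical forward-backward splitting (proximal gradient), for illustration only.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(grad_f, prox_g, x0, step_size, n_iters=100):
    x = x0.copy()
    for _ in range(n_iters):
        # forward (gradient) step on f, then backward (prox) step on g
        x = prox_g(x - step_size * grad_f(x), step_size)
    return x

# Example: LASSO 1/2||Ax - b||^2 + lam*||x||_1 (A, b, lam are placeholders).
# x_hat = forward_backward(lambda x: A.T @ (A @ x - b),
#                          lambda v, t: soft_threshold(v, lam * t),
#                          np.zeros(A.shape[1]),
#                          step_size=1.0 / np.linalg.norm(A, 2) ** 2)
```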

Asynchronous Stochastic Subgradient Methods for General Nonsmooth Nonconvex Optimization

Asynchronous distributed methods are a popular way to reduce the communication and synchronization costs of large-scale optimization. Yet, for all their success, little is known about their convergence guarantees in the challenging case of general non-smooth, non-convex objectives, beyond cases where closed-form proximal operator solutions are available. This is all the more surprising since these … Read more

Acceleration of SVRG and Katyusha X by Inexact Preconditioning

Empirical risk minimization is an important class of optimization problems with many popular machine learning applications, and stochastic variance reduction methods are popular choices for solving them. Among these methods, SVRG and Katyusha X (a Nesterov-accelerated SVRG) achieve fast convergence without substantial memory requirements. In this paper, we propose to accelerate these two algorithms … Read more
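
As background, the sketch below is plain SVRG for the finite-sum problem min (1/n) sum_i f_i(x), without the inexact preconditioning proposed in the paper. The component-gradient oracle `grad_i(x, i)` is a placeholder.

```python
# Plain SVRG sketch: snapshot full gradient plus variance-reduced inner updates.
import numpy as np

def svrg(grad_i, x0, n, step_size, n_epochs=10, inner_iters=None):
    inner_iters = n if inner_iters is None else inner_iters
    x = x0.copy()
    rng = np.random.default_rng(0)
    for _ in range(n_epochs):
        x_snap = x.copy()
        # full gradient at the snapshot point
        full_grad = sum(grad_i(x_snap, i) for i in range(n)) / n
        for _ in range(inner_iters):
            i = rng.integers(n)
            # variance-reduced stochastic gradient estimate
            g = grad_i(x, i) - grad_i(x_snap, i) + full_grad
            x = x - step_size * g
    return x
```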

Beyond Alternating Updates for Matrix Factorization with Inertial Bregman Proximal Gradient Algorithms

Matrix factorization is a popular non-convex objective, for which alternating minimization schemes are most commonly used. They usually suffer from the major drawback that the solution is biased towards one of the optimization variables. A remedy is non-alternating schemes. However, due to a lack of Lipschitz continuity of the gradient in matrix factorization problems, convergence cannot … Read more
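
The sketch below only illustrates the non-alternating idea, a joint Euclidean gradient step on f(U, V) = 1/2 ||X - U V^T||_F^2 that updates both factors simultaneously. The paper's inertial Bregman proximal gradient method replaces this Euclidean step with a Bregman step built from a kernel adapted to the non-Lipschitz gradient, which is not shown here.

```python
# Joint (non-alternating) gradient step for matrix factorization, for illustration only.
import numpy as np

def joint_gradient_step(U, V, X, step_size=1e-3):
    R = X - U @ V.T               # residual X - U V^T
    grad_U = -R @ V               # gradient of f with respect to U
    grad_V = -R.T @ U             # gradient of f with respect to V
    return U - step_size * grad_U, V - step_size * grad_V
```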

On the Relation between the Extended Supporting Hyperplane Algorithm and Kelley’s Cutting Plane Algorithm

Recently, Kronqvist et al. rediscovered the supporting hyperplane algorithm of Veinott and demonstrated its computational benefits for solving convex mixed-integer nonlinear programs. In this paper we derive the algorithm from a geometric point of view. This enables us to show that the supporting hyperplane algorithm is equivalent to Kelley’s cutting plane algorithm applied to a particular … Read more
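
For reference, the sketch below is a bare-bones version of Kelley's cutting plane method for minimizing a convex function over a box, with the LP master problem solved by scipy.optimize.linprog. The functions f and grad_f are placeholders; this illustrates the cutting plane idea only, not the (extended) supporting hyperplane algorithm studied in the paper.

```python
# Kelley's cutting plane method: build a piecewise-linear model from subgradient
# cuts f(x_k) + g_k^T (x - x_k) <= t and minimize it over the box each iteration.
import numpy as np
from scipy.optimize import linprog

def kelley(f, grad_f, x0, box, n_iters=50):
    n = len(x0)
    cuts_A, cuts_b = [], []               # rows [g, -1], rhs g^T x_k - f(x_k)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = grad_f(x)
        cuts_A.append(np.append(g, -1.0))
        cuts_b.append(g @ x - f(x))
        c = np.append(np.zeros(n), 1.0)   # minimize the epigraph variable t
        res = linprog(c, A_ub=np.array(cuts_A), b_ub=np.array(cuts_b),
                      bounds=list(box) + [(None, None)], method="highs")
        x = res.x[:n]                     # minimizer of the current cutting plane model
    return x
```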

Numerical solution of generalized minimax problems

This contribution contains the description and investigation of four numerical methods for solving generalized minimax problems, which consist in the minimization of functions that are compositions of special smooth convex functions with maxima of smooth functions (the most important problem of this type is the sum of maxima of smooth functions). Section 1 is introductory. In … Read more
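
One possible formalization of the problem class described above (the precise assumptions are those stated in the paper) is the following.

```latex
\[
  \min_{x \in \mathbb{R}^n} F(x), \qquad
  F(x) = h\bigl(F_1(x), \ldots, F_m(x)\bigr), \qquad
  F_i(x) = \max_{1 \le j \le p_i} f_{ij}(x),
\]
% with h smooth and convex and the f_{ij} smooth; the special case emphasized
% in the abstract is the sum of maxima:
\[
  F(x) = \sum_{i=1}^{m} \max_{1 \le j \le p_i} f_{ij}(x).
\]
```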