Convergence of Inexact Forward–Backward Algorithms Using the Forward–Backward Envelope

This paper deals with a general framework for inexact forward–backward algorithms aimed at minimizing the sum of an analytic function and a lower semicontinuous, subanalytic, convex term. This framework relies on an implementable inexactness condition for the computation of the proximal operator, and on a linesearch procedure which is possibly performed whenever a variable metric is …
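The exact forward–backward (proximal gradient) step that this framework relaxes can be sketched as follows; the paper's inexactness condition, variable metric, and linesearch are omitted, and the LASSO instance, stepsize rule, and names below are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(grad_f, prox_g, x0, step, iters=500):
    # exact forward-backward iteration:
    # x_{k+1} = prox_{step*g}(x_k - step * grad_f(x_k))
    x = x0
    for _ in range(iters):
        x = prox_g(x - step * grad_f(x), step)
    return x

# toy composite problem: min 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5])
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: soft_threshold(v, t * lam)
step = 1.0 / np.linalg.norm(A.T @ A, 2)   # 1/L, L = Lipschitz constant of grad_f
x = forward_backward(grad_f, prox_g, np.zeros(5), step)
```

With an exact prox this is a descent method for stepsizes up to 1/L; the paper's contribution concerns what happens when the prox is only computed approximately.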

Zero Order Stochastic Weakly Convex Composite Optimization

In this paper we consider stochastic weakly convex composite problems, without assuming access to a stochastic subgradient oracle. We present a derivative-free algorithm that uses a two-point approximation to compute a gradient estimate of the smoothed function. We prove convergence at a rate similar to that of state-of-the-art methods, however with …
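The kind of two-point gradient estimator the abstract refers to can be sketched as below (a Gaussian-smoothing variant; the test function, smoothing radius `mu`, and sample count are illustrative choices, not taken from the paper):

```python
import numpy as np

def two_point_grad_estimate(F, x, mu, rng):
    # Gaussian-smoothing two-point estimator: with u ~ N(0, I),
    # g = (F(x + mu*u) - F(x - mu*u)) / (2*mu) * u
    # is an unbiased estimate of the gradient of the smoothed function.
    u = rng.standard_normal(x.shape)
    return (F(x + mu * u) - F(x - mu * u)) / (2.0 * mu) * u

# sanity check on F(x) = 0.5*||x||^2, whose gradient at x is x:
rng = np.random.default_rng(1)
F = lambda z: 0.5 * np.dot(z, z)
x = np.array([1.0, -2.0, 3.0])
est = np.mean([two_point_grad_estimate(F, x, 1e-4, rng)
               for _ in range(20000)], axis=0)
```

Averaging many estimates recovers the true gradient up to Monte Carlo noise; in the algorithm itself a single (or small batch of) estimate per iteration is used.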

A relative-error inertial-relaxed inexact projective splitting algorithm

For solving structured monotone inclusion problems involving the sum of finitely many maximal monotone operators, we propose and study a relative-error inertial-relaxed inexact projective splitting algorithm. The proposed algorithm benefits from a combination of inertial and relaxation effects, which are both controlled by parameters within a certain range. We propose sufficient conditions on these parameters …

A Hybrid Gradient Method for Strictly Convex Quadratic Programming

In this paper, a reliable hybrid algorithm for solving convex quadratic minimization problems is presented. At each iteration, two points are computed: first, an auxiliary point $\dot{x}_k$ is generated by performing a gradient step with an optimal steplength; then, the next iterate $x_{k+1}$ is obtained through a weighted sum of $\dot{x}_k$ with the penultimate …
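The auxiliary gradient step with optimal (exact line search) steplength can be sketched for $q(x)=\tfrac12 x^\top A x - b^\top x$; the weighted combination with the penultimate iterate is truncated in the abstract, so only the Cauchy step is shown, and the test problem is invented:

```python
import numpy as np

def cauchy_gradient_step(A, b, x):
    # one steepest-descent step with the exact (optimal) steplength for
    # q(x) = 0.5*x'Ax - b'x: with g = Ax - b, alpha = g'g / g'Ag
    g = A @ x - b
    if np.linalg.norm(g) < 1e-12:   # already at the minimizer
        return x
    alpha = (g @ g) / (g @ (A @ g))
    return x - alpha * g

# strictly convex quadratic: A symmetric positive definite
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = np.zeros(2)
for _ in range(100):
    x = cauchy_gradient_step(A, b, x)
```

The optimal steplength minimizes $q$ exactly along the negative gradient; the hybrid method then combines such points to damp the zigzagging of pure steepest descent.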

Sum theorems for maximal monotone operators under weak compactness conditions

This note presents a summary of our most recent results concerning the maximal monotonicity of the sum of two maximal monotone operators defined in a locally convex space under the classical interiority qualification condition when one of their domains or ranges has a weak relative compactness property.

Coordinate Descent Without Coordinates: Tangent Subspace Descent on Riemannian Manifolds

We extend coordinate descent to manifold domains, and provide convergence analyses for geodesically convex and non-convex smooth objective functions. Our key insight is to draw an analogy between coordinate blocks in Euclidean space and tangent subspaces of a manifold. Hence, our method is called tangent subspace descent (TSD). The core principle behind ensuring convergence of …
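A minimal sketch of the tangent-subspace analogy, assuming the unit sphere as the manifold, $f(x)=x^\top C x$ as the objective, normalization as the retraction, and random coordinate blocks of the tangent space; the paper's actual subspace selection and stepsize rules are not reproduced here:

```python
import numpy as np

def tangent_basis(x):
    # orthonormal basis of the tangent space T_x S^{n-1} = {v : v.x = 0},
    # read off the SVD of the orthogonal projector I - x x^T
    n = len(x)
    U, _, _ = np.linalg.svd(np.eye(n) - np.outer(x, x))
    return U[:, : n - 1]

def tsd_step(C, x, block, step):
    # restrict the Riemannian gradient of f(x) = x'Cx to a block of
    # tangent coordinates, step, and retract back to the sphere
    B = tangent_basis(x)
    coeff = B.T @ (2.0 * C @ x)       # gradient in tangent coordinates
    d = B[:, block] @ coeff[block]    # direction in the chosen subspace
    y = x - step * d
    return y / np.linalg.norm(y)      # retraction

rng = np.random.default_rng(2)
M = rng.standard_normal((6, 6))
C = M @ M.T                           # smooth objective f(x) = x'Cx
x = np.ones(6) / np.sqrt(6.0)
f0 = x @ C @ x
for _ in range(400):
    block = rng.choice(5, size=2, replace=False)  # 2 of the 5 tangent coords
    x = tsd_step(C, x, block, step=0.01)
```

The block of tangent coordinates plays the role a coordinate block plays in Euclidean space; cycling or sampling over blocks drives the objective down while staying on the manifold.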

Optimal Learning for Structured Bandits

We study structured multi-armed bandits, which is the problem of online decision-making under uncertainty in the presence of structural information. In this problem, the decision-maker needs to discover the best course of action despite observing only uncertain rewards over time. The decision-maker is aware of certain structural information regarding the reward distributions and would …

A bundle method for nonsmooth DC programming with application to chance-constrained problems

This work considers nonsmooth and nonconvex optimization problems whose objective and constraint functions are defined by difference-of-convex (DC) functions. We consider an infeasible bundle method based on the so-called improvement functions to compute critical points for problems of this class. Our algorithm neither employs penalization techniques nor solves subproblems with linearized constraints. The approach, which …

Primal Space Necessary Characterizations of Transversality Properties

This paper continues the study of general nonlinear transversality properties of collections of sets and focuses on primal space necessary (in some cases also sufficient) characterizations of the properties. We formulate geometric, metric and slope characterizations, particularly in the convex setting. The Hölder case is given special attention. Quantitative relations between the nonlinear transversality …

Proximal splitting algorithms: Relax them all!

Convex optimization problems, whose solutions live in very high-dimensional spaces, have become ubiquitous. To solve them, proximal splitting algorithms are particularly well suited: they consist of simple operations, handling the terms in the objective function separately. We present several existing proximal splitting algorithms and derive new ones, within a unified framework, which consists …
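The relaxation the title alludes to is, generically, the over/under-relaxed fixed-point update $x + \rho\,(T(x)-x)$ applied to a splitting operator $T$. Below is a sketch with $T$ a proximal-gradient operator for a toy $\ell_1$-regularized problem; the operator, stepsize, and $\rho$ are illustrative choices, not the paper's framework:

```python
import numpy as np

def relax(T, x, rho):
    # relaxed fixed-point update: x + rho*(T(x) - x);
    # rho in (0, 2) for an averaged operator T, rho > 1 is over-relaxation
    return x + rho * (T(x) - x)

# toy problem: min 0.5*||x - c||^2 + lam*||x||_1,
# whose solution is the soft-thresholding of c at level lam
c = np.array([2.0, -0.3, 0.0, 1.5])
lam = 0.5
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
step = 0.9
T = lambda x: soft(x - step * (x - c), step * lam)  # proximal-gradient operator
x = np.zeros_like(c)
for _ in range(200):
    x = relax(T, x, rho=1.5)   # over-relaxed iteration
```

The fixed points of $T$ are unchanged by relaxation; tuning $\rho$ only changes how the iterates approach them, which is what the unified framework exploits.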