Optimal subgradient algorithms with application to large-scale linear inverse problems

This study addresses algorithms for solving structured unconstrained convex optimization problems using first-order information, where the underlying function involves high-dimensional data. The primary aim is to develop an implementable algorithmic framework for solving problems with multi-term composite objective functions involving linear mappings, using the optimal subgradient algorithm (OSGA) proposed by Neumaier [NeuO]. … Read more
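
For orientation, the multi-term composite problems alluded to here can be written in the following generic form; this is a hedged reading of the problem class, not necessarily the paper's exact formulation:

```latex
\min_{x \in \mathbb{R}^n} \; f(x) \;=\; \sum_{i=1}^{m} f_i(\mathcal{A}_i x)
```

where each \(f_i\) is convex (possibly nonsmooth) and each \(\mathcal{A}_i\) is a linear mapping; a typical instance from linear inverse problems is \(f(x) = \tfrac{1}{2}\|Ax-b\|_2^2 + \lambda\|Wx\|_1\).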

OSGA: A fast subgradient algorithm with optimal complexity

This paper presents an algorithm for approximately minimizing a convex function over simple, not necessarily bounded, convex domains, assuming only that function values and subgradients are available. No global information about the objective function is needed apart from a strong convexity parameter (which can be set to zero if only plain convexity is known). The worst … Read more
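
For a sense of the oracle model (only function values and subgradients), here is a minimal plain subgradient method in Python. It is deliberately not OSGA, whose auxiliary subproblem and step-size control are more involved; the step-size rule and the toy problem below are illustrative assumptions:

```python
import numpy as np

def subgradient_method(oracle, x0, steps=500):
    """Plain subgradient method: x_{k+1} = x_k - t_k g_k with t_k = 1/sqrt(k+1).

    `oracle(x)` must return (f(x), g) where g is any subgradient of f at x.
    Keeps the best iterate seen, since f need not decrease monotonically.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), np.inf
    for k in range(steps):
        f, g = oracle(x)
        if f < best_f:
            best_f, best_x = f, x.copy()
        x = x - g / np.sqrt(k + 1.0)
    return best_x, best_f

# Example: f(x) = ||x - c||_1, a nonsmooth convex function.
c = np.array([1.0, -2.0, 3.0])
oracle = lambda x: (np.abs(x - c).sum(), np.sign(x - c))
x_star, f_star = subgradient_method(oracle, np.zeros(3))
```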

Gauge optimization, duality, and applications

Gauge functions significantly generalize the notion of a norm, and gauge optimization, as defined by Freund (1987), seeks the element of a convex set that is minimal with respect to a gauge function. This conceptually simple problem can be used to model a remarkable array of useful problems, including a special case of conic optimization, … Read more
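
For reference, the gauge of a convex set \(C\) containing the origin, and the gauge optimization problem in Freund's sense, take the following standard forms:

```latex
\kappa_C(x) \;=\; \inf\{\lambda \ge 0 \,:\, x \in \lambda C\},
\qquad
\min_{x} \; \kappa(x) \quad \text{s.t.} \quad x \in \mathcal{X}
```

where \(\kappa\) is a gauge and \(\mathcal{X}\) is a convex set; when \(C\) is the unit ball of a norm, \(\kappa_C\) is exactly that norm, so norm-minimization problems are special cases.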

Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization

In this paper, we present a new stochastic algorithm, namely the stochastic block mirror descent (SBMD) method, for solving large-scale nonsmooth and stochastic optimization problems. The basic idea of this algorithm is to incorporate block-coordinate decomposition and an incremental block averaging scheme into the classic (stochastic) mirror descent method, in order to significantly reduce the … Read more
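
A minimal sketch of the block-coordinate ingredient with the Euclidean mirror map (the simplest prox-function); the stochastic oracle and the incremental block averaging scheme of SBMD are omitted, and the block partition and step size below are illustrative assumptions:

```python
import numpy as np

def block_mirror_descent(grad_block, x0, blocks, stepsize, iters=1000, seed=0):
    """Randomized block update with the Euclidean mirror map:
    at each iteration pick one block i and set
        x[i] <- x[i] - stepsize * g_i(x),
    where g_i is a (possibly stochastic) partial gradient for block i.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        i = rng.integers(len(blocks))      # sample a block uniformly at random
        idx = blocks[i]
        x[idx] -= stepsize * grad_block(x, idx)
    return x

# Example: least squares split into two coordinate blocks.
A = np.array([[3.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 4.0]])
b = np.array([1.0, 0.0, 2.0])
grad_block = lambda x, idx: (A.T @ (A @ x - b))[idx]
x = block_mirror_descent(grad_block, np.zeros(3),
                         [np.array([0, 1]), np.array([2])],
                         stepsize=0.05, iters=2000)
```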

Nonsmooth Optimization Using Uncontrolled Inexact Information

We consider convex nonsmooth optimization problems whose objective function is known through a (fine) oracle together with some additional (cheap but poor) information, formalized as a second, coarse oracle with uncontrolled inexactness. This is the case when the objective function is itself the output of an optimization solver, for instance one using a branch-and-bound procedure, or decomposing … Read more
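
To make the two-oracle setting concrete, a hypothetical interface might look as follows; the names and the mocked noise model are assumptions for illustration, not the paper's formalism:

```python
import numpy as np

def fine_oracle(x):
    """Accurate but expensive: exact value and subgradient of f(x) = ||x||_1."""
    return np.abs(x).sum(), np.sign(x)

_rng = np.random.default_rng(0)

def coarse_oracle(x):
    """Cheap but polluted: value and subgradient with uncontrolled error,
    e.g. from an early-stopped inner solver. Here the error is mocked with
    noise whose magnitude is unknown to the caller."""
    err = _rng.uniform(0.0, 0.5)
    return np.abs(x).sum() + err, np.sign(x)
```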

A merit function approach for direct search

In this paper, we propose to equip direct-search methods with a general procedure to minimize an objective function, possibly nonsmooth, without using derivatives and subject to constraints on the variables. The aim is to handle constraints, most likely nonlinear or nonsmooth, for which the derivatives of the corresponding functions are also unavailable. The novelty of … Read more
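
A minimal sketch of a derivative-free coordinate search driven by a merit function; the quadratic-penalty merit function below is a plain textbook choice used for illustration, not necessarily the one proposed in the paper:

```python
import numpy as np

def coordinate_search(f, cons, x0, step=1.0, tol=1e-6, mu=10.0):
    """Derivative-free coordinate search on the merit function
        m(x) = f(x) + mu * sum(max(0, c_j(x))^2)   for constraints c_j(x) <= 0.
    Polls +/- step along each coordinate; halves the step when no poll
    point improves the merit value.
    """
    merit = lambda x: f(x) + mu * sum(max(0.0, c(x)) ** 2 for c in cons)
    x = np.asarray(x0, dtype=float).copy()
    fx = merit(x)
    while step > tol:
        improved = False
        for i in range(x.size):
            for s in (step, -step):
                y = x.copy()
                y[i] += s
                fy = merit(y)
                if fy < fx:                 # accept an improving poll point
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5                     # refine the mesh
    return x

# Example: minimize ||x||^2 subject to x0 + x1 >= 1, i.e. 1 - x0 - x1 <= 0.
x = coordinate_search(lambda x: x @ x, [lambda x: 1.0 - x[0] - x[1]],
                      np.array([2.0, 2.0]))
```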

A doubly stabilized bundle method for nonsmooth convex optimization

We propose a bundle method for minimizing nonsmooth convex functions that combines both the level and the proximal stabilizations. Most bundle algorithms use a cutting-plane model of the objective function to formulate a subproblem whose solution gives the next iterate. Proximal bundle methods employ the model in the objective function of the subproblem, while level … Read more
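
To illustrate the subproblem structure, here is a hedged sketch of a single proximal bundle step: the cutting-plane model is a maximum of affine cuts, and the proximal subproblem is solved in epigraph form with SciPy's SLSQP. This is generic bundle machinery shown for intuition, not the doubly stabilized method itself:

```python
import numpy as np
from scipy.optimize import minimize

def proximal_bundle_step(bundle, center, t=1.0):
    """Solve  min_x  max_i [f_i + g_i.(x - x_i)] + (1/2t)||x - center||^2
    in epigraph form: variables z = (x, r), minimize r + (1/2t)||x - c||^2
    subject to r >= f_i + g_i.(x - x_i) for every cut (x_i, f_i, g_i).
    """
    n = center.size
    def obj(z):
        x, r = z[:n], z[n]
        return r + 0.5 / t * np.sum((x - center) ** 2)
    cons = [{"type": "ineq",
             "fun": (lambda z, xi=xi, fi=fi, gi=gi:
                     z[n] - fi - gi @ (z[:n] - xi))}
            for xi, fi, gi in bundle]
    z0 = np.concatenate([center, [max(fi for _, fi, _ in bundle)]])
    res = minimize(obj, z0, constraints=cons, method="SLSQP")
    return res.x[:n]

# Example: cuts of f(x) = |x| collected at x = -1 and x = 1; the proximal
# step from center 1 with t = 1 lands at the minimizer x = 0.
bundle = [(np.array([-1.0]), 1.0, np.array([-1.0])),
          (np.array([1.0]), 1.0, np.array([1.0]))]
x_next = proximal_bundle_step(bundle, center=np.array([1.0]), t=1.0)
```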

Proximal bundle methods in depth: a unified analysis for inexact oracles

The last few years have seen the advent of a new generation of bundle methods, capable of handling inexact oracles polluted by “noise”. Proving convergence of a bundle method is never simple, and coping with inexact oracles substantially increases the technicalities. Besides, several variants exist to deal with noise, each one needing an ad hoc proof … Read more

Reducing the Number of Function Evaluations in Mesh Adaptive Direct Search Algorithms

The mesh adaptive direct search (MADS) class of algorithms is designed for nonsmooth optimization, where the objective function and constraints are typically computed by launching a time-consuming computer simulation. Each iteration of a MADS algorithm attempts to improve the current best-known solution by launching the simulation at a finite number of trial points. Common implementations … Read more
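
One elementary device for saving simulation calls, not the paper's contribution but in the same spirit, is to memoize evaluated trial points so that no point is simulated twice; a minimal sketch (the rounding granularity used for cache keys is an assumption):

```python
def cached_simulation(simulate, decimals=10):
    """Wrap an expensive black-box `simulate(x)` with a cache keyed on the
    (rounded) coordinates, so repeated trial points cost nothing."""
    cache = {}
    def wrapped(x):
        key = tuple(round(v, decimals) for v in x)
        if key not in cache:
            cache[key] = simulate(x)      # the only place the simulation runs
        return cache[key]
    return wrapped

# Example: count how many true evaluations a poll loop performs.
calls = 0
def simulate(x):
    global calls
    calls += 1
    return sum(v * v for v in x)

f = cached_simulation(simulate)
for x in [(0.0, 1.0), (1.0, 0.0), (0.0, 1.0)]:   # third point is a repeat
    f(x)
assert calls == 2
```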

Nonsmooth cone-constrained optimization with applications to semi-infinite programming

The paper is devoted to the study of general nonsmooth problems of cone-constrained optimization (or conic programming), important for various aspects of optimization theory and applications. Based on advanced constructions and techniques of variational analysis and generalized differentiation, we derive new necessary optimality conditions (in both “exact” and “fuzzy” forms) for nonsmooth conic programs, establish … Read more
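
For orientation, a cone-constrained (conic) program and its semi-infinite specialization can be written in the following standard forms:

```latex
\min_{x \in \Omega} \; f(x) \quad \text{s.t.} \quad g(x) \in -K
```

where \(K\) is a closed convex cone; taking \(g\) with values in a space of functions on an index set \(T\) and \(K\) the cone of nonnegative functions on \(T\) recovers the semi-infinite constraints \(g_t(x) \le 0\) for all \(t \in T\).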