A Primal-Dual Homotopy Algorithm for $\ell_1$-Minimization with $\ell_\infty$-Constraints

In this paper we propose a primal-dual homotopy method for $\ell_1$-minimization problems with $\ell_\infty$-norm constraints in the context of sparse reconstruction. The natural homotopy parameter is the value of the bound on the constraints, and we show that there exists a piecewise linear solution path with finitely many break points for the primal problem …
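For orientation, the problem class in question can be sketched as follows (a standard sparse-reconstruction setup; the matrix $A$, data $b$, and bound $\delta$ are our notation, not necessarily the paper's):

$$ \min_{x \in \mathbb{R}^n} \ \|x\|_1 \quad \text{subject to} \quad \|Ax - b\|_\infty \le \delta . $$

The homotopy then follows optimal solutions as the bound $\delta$ is driven from a large value down to its target, which is the piecewise linear solution path mentioned above.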

Accelerated first-order methods for large-scale convex minimization

This paper discusses several (sub)gradient methods attaining the optimal complexity for smooth problems with Lipschitz continuous gradients, nonsmooth problems with bounded variation of subgradients, and weakly smooth problems with Hölder continuous gradients. The proposed schemes are optimal for smooth strongly convex problems with Lipschitz continuous gradients and optimal up to a logarithmic factor for nonsmooth problems …
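For reference, these optimality claims are measured against the classical lower-bound benchmarks (standard results in the sense of Nemirovski-Yudin and Nesterov; the notation and the omitted domain-size constants below are ours). To reach accuracy $\varepsilon$, the optimal iteration counts are

$$ O\bigl(\sqrt{L/\varepsilon}\bigr) \ \ \text{(smooth, $L$-Lipschitz gradient)}, \qquad O\bigl(M^2/\varepsilon^2\bigr) \ \ \text{(nonsmooth, subgradients bounded by $M$)}, $$

$$ O\Bigl(\bigl(M_\nu/\varepsilon\bigr)^{2/(1+3\nu)}\Bigr) \ \ \text{(weakly smooth, $\nu$-H\"older gradient)}, \qquad O\bigl(\sqrt{L/\mu}\,\log(1/\varepsilon)\bigr) \ \ \text{(smooth, $\mu$-strongly convex)}. $$

Note that the Hölder case interpolates between the two extremes: $\nu = 1$ recovers the smooth rate and $\nu = 0$ the nonsmooth one.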

Lagrangian relaxation for SVM feature selection

We discuss a Lagrangian-relaxation-based heuristic for feature selection in a standard $\ell_1$-norm Support Vector Machine (SVM) framework for binary classification. The feature selection model we adopt is a Mixed Binary Linear Programming problem, which makes it well suited to a Lagrangian relaxation approach. Based on a property of the optimal multiplier setting, we …
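A typical model of this kind (our sketch; the big-$M$ constant, feature budget $B$, and variable names are assumptions, not taken from the paper) couples the $\ell_1$-SVM with binary feature indicators $y_j$:

$$ \min_{w,b,\xi,y} \ \textstyle\sum_j |w_j| + C \sum_i \xi_i \quad \text{s.t.} \quad \ell_i\,(w^\top x_i + b) \ge 1 - \xi_i, \ \ \xi_i \ge 0, \ \ |w_j| \le M y_j, \ \ \textstyle\sum_j y_j \le B, \ \ y_j \in \{0,1\}. $$

Relaxing the coupling constraints $|w_j| \le M y_j$ with multipliers separates the binary variables from the continuous SVM part, which is what makes a Lagrangian approach attractive here.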

New Computational Guarantees for Solving Convex Optimization Problems with First Order Methods, via a Function Growth Condition Measure

Motivated by recent work of Renegar, we present new computational methods and associated computational guarantees for solving convex optimization problems using first-order methods. Our problem of interest is the general convex optimization problem $f^* = \min_{x \in Q} f(x)$, where we presume knowledge of a strict lower bound $f_{\mathrm{slb}} < f^*$. [Indeed, $f_{\mathrm{slb}}$ is naturally …
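In display form, the setting is

$$ f^* \;=\; \min_{x \in Q} f(x), \qquad f_{\mathrm{slb}} < f^* \ \text{known in advance}. $$

One natural normalized quantity for stating guarantees in such a setting is the relative gap $\bigl(f(x_k) - f^*\bigr)/\bigl(f^* - f_{\mathrm{slb}}\bigr)$; we offer this only as a plausible reading of the truncated abstract, not as the paper's exact growth measure.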

A forward-backward dynamical approach to the minimization of the sum of a nonsmooth convex with a smooth nonconvex function

We address the minimization of the sum of a proper, convex and lower semicontinuous function and a (possibly nonconvex) smooth function from the perspective of an implicit dynamical system of forward-backward type. The latter is formulated by means of the gradient of the smooth function and the proximal point operator of the nonsmooth one. The …
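A minimal sketch of such a system, assuming the standard form studied in this line of work (with $f$ the proper, convex, lower semicontinuous term, $g$ the smooth possibly nonconvex term, and a stepsize $\gamma > 0$):

$$ \dot{x}(t) + x(t) = \operatorname{prox}_{\gamma f}\bigl(x(t) - \gamma \nabla g(x(t))\bigr), \qquad x(0) = x_0 . $$

Its equilibria are precisely the fixed points of the forward-backward operator, i.e., the critical points of $f + g$.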

On the convergence rate of an inexact proximal point algorithm for quasiconvex minimization on Hadamard manifolds

In this paper we present a convergence rate analysis of an inexact proximal point algorithm for solving minimization problems with quasiconvex objective functions on Hadamard manifolds. We prove that, under natural assumptions, the sequence generated by the algorithm converges linearly or superlinearly to a critical point of the problem.
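Schematically, one step of such a scheme reads (our sketch; $d$ denotes the Riemannian distance on the Hadamard manifold $M$ and $\lambda_k > 0$ the proximal parameters, with the subproblem solved only up to a prescribed inexactness):

$$ x^{k+1} \approx \operatorname*{arg\,min}_{x \in M} \ \Bigl\{\, f(x) + \tfrac{\lambda_k}{2}\, d^{\,2}(x, x^k) \Bigr\}. $$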

An O(1/k) Convergence Rate for the Variable Stepsize Bregman Operator Splitting Algorithm

An earlier paper proved the convergence of a variable stepsize Bregman operator splitting algorithm (BOSVS) for minimizing $\phi(Bu)+H(u)$, where $H$ and $\phi$ are convex functions and $\phi$ is possibly nonsmooth. The algorithm was shown to be relatively efficient when applied to partially parallel magnetic resonance image reconstruction problems. In this paper, the convergence rate of …
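For context, methods of this family operate on the standard splitting reformulation of the objective (a generic sketch; the paper's exact Bregman terms and stepsize rule differ in detail):

$$ \min_{u,w} \ \phi(w) + H(u) \ \ \text{s.t.} \ \ w = Bu, \qquad L_\rho(u,w,\lambda) = \phi(w) + H(u) + \langle \lambda,\, Bu - w \rangle + \tfrac{\rho}{2}\|Bu - w\|^2 , $$

with the variable stepsize of BOSVS entering through the proximal term used in the $u$-update.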

Decomposition algorithm for large-scale two-stage unit-commitment

Every day, electricity generation companies submit a generation schedule to the grid operator for the coming day; computing an optimal schedule is called the unit-commitment problem. Generation companies can also occasionally submit changes to the schedule, which can be seen as intra-daily incomplete recourse actions. In this paper, we propose a two-stage formulation of unit-commitment, wherein …
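In abstract two-stage form (a schematic sketch; the symbols are ours), the model pairs a day-ahead commitment $x$ with recourse decisions $y$ taken once the uncertainty $\xi$ is revealed:

$$ \min_{x \in X} \ c^\top x + \mathbb{E}_\xi\bigl[Q(x,\xi)\bigr], \qquad Q(x,\xi) = \min_{y \in Y(x,\xi)} \ q^\top y . $$

The decomposition then works stage by stage, with the second-stage problems separating across realizations of $\xi$.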

An optimal subgradient algorithm with subspace search for costly convex optimization problems

This paper presents an acceleration of the optimal subgradient algorithm OSGA \cite{NeuO} for solving convex optimization problems whose objective function involves costly affine and cheap nonlinear terms. We combine OSGA with a multidimensional subspace search technique, which leads to a low-dimensional problem that can be solved efficiently. Numerical results concerning some applications are reported. A …
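The subspace idea can be sketched as follows (our notation, not the paper's: write the objective as $f(x) = \varphi(Ax) + h(x)$ with the affine map $A$ costly and $h$ cheap, and let $d_1, \dots, d_m$ be directions collected from previous iterations). Once the products $A d_i$ are stored, the subspace problem

$$ \min_{\alpha \in \mathbb{R}^m} \ \varphi\Bigl(Ax^k + \textstyle\sum_{i=1}^m \alpha_i\, (A d_i)\Bigr) + h\Bigl(x^k + \textstyle\sum_{i=1}^m \alpha_i d_i\Bigr) $$

requires no further applications of $A$, which is what makes the low-dimensional search cheap.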

A Nonmonotone Approach without Differentiability Test for Gradient Sampling Methods

Recently, optimization problems involving nonsmooth, locally Lipschitz functions have been the subject of investigation, and an innovative method known as Gradient Sampling has gained attention. Although the method has shown good results on important real-world problems, some drawbacks remain unexplored. This study suggests modifications to the gradient sampling class of methods in order to …
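For orientation, a plain gradient sampling step in the classical Burke-Lewis-Overton form (the baseline these modifications depart from) computes

$$ g^k = \operatorname*{arg\,min} \bigl\{ \|g\| \ : \ g \in \operatorname{conv}\{\nabla f(x^{k,1}), \dots, \nabla f(x^{k,m})\} \bigr\}, \qquad x^{k,i} \ \text{sampled in} \ B(x^k, \epsilon_k), $$

followed by a line search along $-g^k$. The classical method also verifies differentiability of $f$ at each new iterate; that verification is the test the title refers to removing.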