A Framework for Applying Subgradient Methods to Conic Optimization Problems (version 2)

A framework is presented whereby a general convex conic optimization problem is transformed into an equivalent convex optimization problem whose only constraints are linear equations and whose objective function is Lipschitz continuous. Virtually any subgradient method can be applied to solve the equivalent problem. Two methods are analyzed. (In version 2, the development of algorithms …
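
The transformed problem has exactly the shape a projected subgradient method handles well. As a minimal sketch (a generic textbook iteration, not the paper's framework), here is projected subgradient descent for minimizing a Lipschitz convex f subject to Ax = b, where f, subgrad, A and b are all placeholder inputs, illustrated on the toy problem min ||x||_1 s.t. Ax = b:

import numpy as np

def projected_subgradient(f, subgrad, A, b, x0, steps=500):
    # Projector onto the affine set {x : A x = b} (A assumed full row rank).
    AAT_inv = np.linalg.inv(A @ A.T)
    project = lambda x: x - A.T @ (AAT_inv @ (A @ x - b))
    x = project(x0)
    best_x, best_f = x, f(x)
    for k in range(1, steps + 1):
        g = subgrad(x)
        if np.linalg.norm(g) < 1e-12:
            break
        x = project(x - g / (np.linalg.norm(g) * np.sqrt(k)))  # diminishing steps
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f

# Toy instance: minimize the Lipschitz function ||x||_1 subject to A x = b.
A = np.array([[1.0, 2.0, -1.0], [0.0, 1.0, 1.0]])
b = np.array([1.0, 2.0])
x, fx = projected_subgradient(lambda x: np.abs(x).sum(), np.sign, A, b, np.zeros(3))
print(x, fx)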

A polynomial-time descent method for separable convex optimization problems with linear constraints

We propose a polynomial-time algorithm for a separable convex optimization problem with linear constraints. We make no additional assumptions about the structure of the objective function beyond polynomial computability; in particular, the objective function may be non-differentiable. The running time of our algorithm is polynomial in the size of the input …
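
As a concrete member of this problem class (an illustration only, not the proposed algorithm): the separable, non-differentiable objective sum_i |x_i - a_i| under linear equality constraints, which small instances can sanity-check via the standard LP lift x_i - a_i = u_i - v_i with u, v >= 0:

import numpy as np
from scipy.optimize import linprog

# Minimize sum_i |x_i - a_i| subject to A x = b, via the LP lift in variables [x, u, v].
a = np.array([1.0, -2.0, 0.5])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])
n = a.size
c = np.concatenate([np.zeros(n), np.ones(n), np.ones(n)])   # cost on u and v only
A_eq = np.block([[A, np.zeros((1, n)), np.zeros((1, n))],   # A x = b
                 [np.eye(n), -np.eye(n), np.eye(n)]])       # x - u + v = a
b_eq = np.concatenate([b, a])
bounds = [(None, None)] * n + [(0, None)] * (2 * n)
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x[:n], res.fun)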

A Nonmonotone Approach without Differentiability Test for Gradient Sampling Methods

Recently, optimization problems involving nonsmooth and locally Lipschitz functions have been the subject of investigation, and an innovative method known as Gradient Sampling has gained attention. Although the method has shown good results on important real problems, some drawbacks remain unaddressed. This study suggests modifications to the gradient sampling class of methods in order to …
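
For orientation, a bare-bones gradient sampling iteration looks roughly as follows (a textbook-style sketch, not the modification proposed here; the differentiability test at the sampled points is already omitted for brevity): sample gradients in a neighborhood of the current point, take the minimum-norm element of their convex hull as a stabilized descent direction, and backtrack.

import numpy as np
from scipy.optimize import minimize

def min_norm_in_hull(G):
    # Minimum-norm point of conv{rows of G}: min w'GG'w over the simplex.
    m = G.shape[0]
    Q = G @ G.T
    w = minimize(lambda w: w @ Q @ w, np.ones(m) / m, jac=lambda w: 2 * Q @ w,
                 bounds=[(0, 1)] * m,
                 constraints={'type': 'eq', 'fun': lambda w: w.sum() - 1}).x
    return w @ G

def gradient_sampling(f, grad, x, eps=0.1, m=10, iters=60, seed=0):
    rng = np.random.default_rng(seed)
    for _ in range(iters):
        # Gradients at x and at m nearby points (a box is used instead of the
        # usual epsilon-ball purely for brevity).
        P = x + eps * rng.uniform(-1, 1, size=(m, x.size))
        g = min_norm_in_hull(np.vstack([grad(x)] + [grad(p) for p in P]))
        if np.linalg.norm(g) < 1e-8:
            eps *= 0.5                       # shrink the sampling radius
            continue
        d, t = -g / np.linalg.norm(g), 1.0
        while f(x + t * d) > f(x) - 1e-4 * t * np.linalg.norm(g) and t > 1e-12:
            t *= 0.5                         # Armijo-style backtracking
        x = x + t * d
    return x

# Nonsmooth test: f(x) = |x1| + 2|x2|, differentiable almost everywhere.
f = lambda x: abs(x[0]) + 2 * abs(x[1])
print(gradient_sampling(f, lambda x: np.array([np.sign(x[0]), 2 * np.sign(x[1])]),
                        np.array([3.0, -2.0])))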

Second order forward-backward dynamical systems for monotone inclusion problems

We begin by considering second order dynamical systems of the form $\ddot x(t) + \Gamma (\dot x(t)) + \lambda(t)B(x(t))=0$, where $\Gamma: {\cal H}\rightarrow{\cal H}$ is an elliptic bounded self-adjoint linear operator defined on a real Hilbert space ${\cal H}$, $B: {\cal H}\rightarrow{\cal H}$ is a cocoercive operator and $\lambda:[0,+\infty)\rightarrow [0,+\infty)$ is a relaxation function depending …
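
To make the dynamics concrete, here is a hedged numerical sketch: an explicit Euler discretization of the system with the valid elliptic choice $\Gamma = \gamma \mathrm{Id}$ and the 1-cocoercive operator $B(x) = x - c$ (the gradient of the convex function $\frac{1}{2}\|x-c\|^2$), whose trajectory should settle at the zero of $B$, namely $c$.

import numpy as np

# Explicit Euler discretization of  x'' + gamma * x' + lam(t) * B(x) = 0,
# written as a first-order system in (x, v) with v = x'.
def integrate(B, x0, gamma=2.0, lam=lambda t: 1.0, h=0.01, T=20.0):
    x, v = x0.astype(float), np.zeros_like(x0, dtype=float)
    t = 0.0
    while t < T:
        a = -gamma * v - lam(t) * B(x)   # acceleration prescribed by the dynamics
        x = x + h * v
        v = v + h * a
        t += h
    return x

# B(x) = x - c is 1-cocoercive; the trajectory converges to its zero, c.
c = np.array([1.0, -3.0])
print(integrate(lambda x: x - c, np.zeros(2)))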

Activity Identification and Local Linear Convergence of Forward–Backward-type methods

In this paper, we consider a class of Forward–Backward (FB) splitting methods that includes several variants (e.g. inertial schemes, FISTA) for minimizing the sum of two proper convex and lower semi-continuous functions, one of which has a Lipschitz continuous gradient and the other is partly smooth relative to a smooth active manifold $\mathcal{M}$. We propose …
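
For instance, with the $\ell_1$ norm the active manifold $\mathcal{M}$ is the set of points sharing the support of the minimizer, and FB-type methods identify it after finitely many iterations. A minimal FISTA sketch (one of the variants covered) on a lasso instance, with all problem data synthetic:

import numpy as np

def fista(A, b, tau, iters=300):
    # Inertial Forward-Backward splitting for min 0.5*||Ax - b||^2 + tau*||x||_1.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = z = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(iters):
        g = A.T @ (A @ z - b)              # forward (gradient) step on the smooth part
        w = z - g / L
        x_new = np.sign(w) * np.maximum(np.abs(w) - tau / L, 0)  # backward (prox) step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = x_new + ((t - 1) / t_new) * (x_new - x)              # inertial extrapolation
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50); x_true[[3, 17]] = [2.0, -1.5]
b = A @ x_true
x = fista(A, b, tau=0.5)
print(np.flatnonzero(np.abs(x) > 1e-6))    # support stabilizes: activity identification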

Convergence rate of a proximal multiplier algorithm for separable convex minimization

The proximal multiplier method with proximal distances (PMAPD), proposed by O. Sarmiento C., E. A. Papa Quiroz and P. R. Oliveira to solve convex programs with separable structure, unified the works of Chen and Teboulle (PCPM method), Kyono and Fukushima (NPCPMM), and Auslender and Teboulle (EPDM), and extended the convergence properties for the …

Parallel Block Coordinate Minimization with Application to Group Regularized Regression

This paper proposes a method for parallel block coordinate-wise minimization of convex functions. Each iteration involves a first phase in which n independent minimizations are performed over the n variable blocks, followed by a phase in which the results of the first phase are coordinated to obtain the whole variable update. Convergence of the method to the …
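
The snippet does not specify the coordination rule, so as a hedged illustration of the two-phase pattern, here is the classical Jacobi-style variant in which coordination is simple averaging, x_new = x + (d - x)/n, where d concatenates the block minimizers. Since each single-block update can only decrease a convex f, Jensen's inequality makes the averaged point a descent step as well.

import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Two-phase iteration on f(x) = 0.5*x'Qx - c'x with three variable blocks.
def block_min(Q, c, x, blk):
    # Phase 1: minimize f over block blk with the other variables held fixed.
    i = np.array(blk)
    rest = np.setdiff1d(np.arange(len(x)), i)
    xi = np.linalg.solve(Q[np.ix_(i, i)], c[i] - Q[np.ix_(i, rest)] @ x[rest])
    return blk, xi

rng = np.random.default_rng(1)
M = rng.standard_normal((6, 6)); Q = M @ M.T + np.eye(6); c = rng.standard_normal(6)
blocks = [[0, 1], [2, 3], [4, 5]]
x = np.zeros(6)
with ThreadPoolExecutor() as pool:
    for _ in range(500):
        d = x.copy()
        for blk, xi in pool.map(lambda blk: block_min(Q, c, x, blk), blocks):
            d[blk] = xi
        x = x + (d - x) / len(blocks)      # Phase 2: coordination by averaging
print(np.max(np.abs(Q @ x - c)))           # near 0: x approaches the minimizer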

Performance of First- and Second-Order Methods for L1-Regularized Least Squares Problems

We study the performance of first- and second-order optimization methods for l1-regularized sparse least-squares problems as the conditioning and the dimensions of the problem increase up to one trillion. A rigorously defined generator is presented which allows control of the dimensions, the conditioning and the sparsity of the problem. The generator has very low memory …
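
The paper's generator is not reproduced in this snippet; as a hedged stand-in, a toy generator exposing the same three knobs (dimensions, conditioning, sparsity) can be built from a prescribed singular-value profile:

import numpy as np

def make_l1ls_instance(m, n, cond, sparsity, rng=np.random.default_rng(0)):
    # Toy generator (not the paper's): conditioning via a geometric singular-value
    # profile, sparsity via the support size of the planted solution x_true.
    k = min(m, n)
    U, _ = np.linalg.qr(rng.standard_normal((m, k)))
    V, _ = np.linalg.qr(rng.standard_normal((n, k)))
    sigma = cond ** (-np.linspace(0, 1, k))      # singular values from 1 down to 1/cond
    A = U @ (sigma[:, None] * V.T)
    x = np.zeros(n)
    support = rng.choice(n, size=int(sparsity * n), replace=False)
    x[support] = rng.standard_normal(support.size)
    b = A @ x + 1e-3 * rng.standard_normal(m)
    return A, b, x

A, b, x_true = make_l1ls_instance(100, 400, cond=1e4, sparsity=0.05)
print(np.linalg.cond(A))                         # ~1e4 by construction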

Smooth Strongly Convex Interpolation and Exact Worst-case Performance of First-order Methods

We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex programs. Finding the worst-case performance of a black-box first-order method is formulated as an optimization problem over a set of smooth (strongly) convex functions and initial conditions. We develop …
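
At the heart of this approach is an interpolation result: a finite set of triples $(x_i, g_i, f_i)$ is consistent with some $L$-smooth, $\mu$-strongly convex function $f$ (with $f(x_i) = f_i$ and $\nabla f(x_i) = g_i$) exactly when a family of quadratic inequalities holds. One form of this condition, as it appears in the convex-interpolation literature (restated here for orientation, not as the paper's exact statement), requires for all pairs $i, j$ that

\[
f_i \;\ge\; f_j + \langle g_j, x_i - x_j\rangle + \frac{1}{2(1-\mu/L)}\left(\frac{1}{L}\,\|g_i - g_j\|^2 + \mu\,\|x_i - x_j\|^2 - \frac{2\mu}{L}\,\langle g_j - g_i, x_j - x_i\rangle\right).
\]

The inequalities are linear in the function values and quadratic in the iterates and gradients, so after a standard Gram-matrix lifting the worst-case performance estimation problem becomes a semidefinite program.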

Polynomial Root Radius Optimization with Affine Constraints

The root radius of a polynomial is the maximum of the moduli of its roots (zeros). We consider the following optimization problem: minimize the root radius over monic polynomials of degree $n$, with either real or complex coefficients, subject to $k$ consistent affine constraints on the coefficients. We show that there always exists an optimal …
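
The root radius is cheap to evaluate numerically, so small instances of this problem are easy to probe. A hedged sketch (illustration only; the objective is nonsmooth and nonconvex in the coefficients, so the scalar search below is a local probe, not the paper's method):

import numpy as np
from scipy.optimize import minimize_scalar

def root_radius(coeffs):
    # Max modulus of the roots; numpy expects the leading coefficient first.
    return np.abs(np.roots(coeffs)).max()

# Toy version of the problem: minimize the root radius over monic cubics
# p(z) = z^3 + a2*z^2 + a1*z + a0 with the affine constraint a0 + a1 = 1,
# restricted for illustration to the line a2 = 0, a1 = t, a0 = 1 - t.
res = minimize_scalar(lambda t: root_radius([1.0, 0.0, t, 1.0 - t]),
                      bounds=(-5, 5), method='bounded')
print(res.x, res.fun)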