Convergence rate of a proximal multiplier algorithm for separable convex minimization

The proximal multiplier method with proximal distances (PMAPD), proposed by O. Sarmiento C., E. A. Papa Quiroz and P. R. Oliveira and applied to solve a convex program with separable structure, unified the works of Chen and Teboulle (PCPM method), Kyono and Fukushima (NPCPMM), and Auslender and Teboulle (EPDM), and extended the convergence properties for the … Read more
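
For context, the separable convex program targeted by such multiplier methods is typically written in the following two-block form (the notation is illustrative and not taken from the paper):

$$ \min_{x_1,\, x_2} \; f_1(x_1) + f_2(x_2) \quad \text{s.t.} \quad A_1 x_1 + A_2 x_2 = b, $$

where $f_1$ and $f_2$ are closed proper convex functions and the blocks are coupled only through the linear constraint, so the proximal subproblems can be handled separately for each block.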

Performance of First- and Second-Order Methods for L1-Regularized Least Squares Problems

We study the performance of first- and second-order optimization methods for $\ell_1$-regularized sparse least-squares problems as the conditioning and the dimensions of the problem increase up to one trillion. A rigorously defined generator is presented that allows control of the dimensions, the conditioning, and the sparsity of the problem. The generator has very low memory … Read more
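
For reference, the $\ell_1$-regularized sparse least-squares problem has the standard form (the symbols $A$, $b$, $\lambda$ below are the usual ones and are not fixed by the abstract):

$$ \min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \|x\|_1, \qquad \lambda > 0, $$

where the conditioning of $A$ and the sparsity of the solution are the quantities such a benchmark generator aims to control.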

Smooth Strongly Convex Interpolation and Exact Worst-case Performance of First-order Methods

We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex programs. Finding the worst-case performance of a black-box first-order method is formulated as an optimization problem over a set of smooth (strongly) convex functions and initial conditions. We develop … Read more
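
Schematically, the worst-case (performance estimation) problem described here reads as follows (a generic statement; the exact performance measure and function class used by the authors may differ):

$$ \max_{f,\; x_0,\ldots,x_N} \; f(x_N) - f(x_\star) \quad \text{s.t.} \quad f \in \mathcal{F}_{\mu,L}, \quad \|x_0 - x_\star\| \le R, \quad x_1,\ldots,x_N \text{ generated by the fixed-step method from } x_0, $$

where $x_\star$ is a minimizer of $f$ and $\mathcal{F}_{\mu,L}$ denotes the class of $L$-smooth, $\mu$-strongly convex functions; interpolation results are what allow this infinite-dimensional problem to be recast as a finite-dimensional convex program.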

Polynomial Root Radius Optimization with Affine Constraints

The root radius of a polynomial is the maximum of the moduli of its roots (zeros). We consider the following optimization problem: minimize the root radius over monic polynomials of degree $n$, with either real or complex coefficients, subject to $k$ consistent affine constraints on the coefficients. We show that there always exists an optimal … Read more
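
Written out, the problem is (illustrative notation):

$$ \min_{a_0,\ldots,a_{n-1}} \; \rho(p_a), \qquad p_a(z) = z^n + a_{n-1}z^{n-1} + \cdots + a_1 z + a_0, \qquad \rho(p_a) = \max\{\, |z| : p_a(z) = 0 \,\}, $$

subject to $k$ consistent affine constraints on the coefficient vector $(a_0,\ldots,a_{n-1})$, with the coefficients restricted to be real or allowed to be complex.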

A Three-Operator Splitting Scheme and its Optimization Applications

Operator splitting schemes have been successfully used in computational sciences to reduce complex problems into a series of simpler subproblems. Since the 1950s, these schemes have been widely used to solve problems in PDEs and control. Recently, large-scale optimization problems in machine learning, signal processing, and imaging have created a resurgence of interest in operator-splitting-based … Read more

Inexact Proximal Point Methods for Quasiconvex Minimization on Hadamard Manifolds

In this paper we present two inexact proximal point algorithms to solve minimization problems for quasiconvex objective functions on Hadamard manifolds. We prove that, under natural assumptions, the sequences generated by the algorithms are well defined and converge to critical points of the problem. We also present an application of the method to demand theory … Read more
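
For orientation, the exact proximal point iteration on a Hadamard manifold $(M, d)$ is usually stated as (standard notation, not necessarily the authors'; the inexact algorithms of the paper relax this subproblem):

$$ x^{k+1} \in \arg\min_{x \in M} \Big\{ f(x) + \tfrac{\lambda_k}{2}\, d^{\,2}(x, x^k) \Big\}, \qquad \lambda_k > 0, $$

where $d$ is the Riemannian distance and $f$ is the quasiconvex objective.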

Solving disjunctive optimization problems by generalized semi-infinite optimization techniques

We describe a new possibility to model disjunctive optimization problems as generalized semi-infinite programs. In contrast to existing methods, our approach requires neither a conjunctive nor a disjunctive normal form. Applying existing lower-level reformulations to the corresponding semi-infinite program, we derive conjunctive nonlinear problems without any logical expressions, which can be locally … Read more
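
As a reminder of the target model class, a generalized semi-infinite program (GSIP) has the form (generic statement; the disjunctive structure enters through the construction of the index set mapping, which is not reproduced here):

$$ \min_{x} \; f(x) \quad \text{s.t.} \quad g(x, y) \le 0 \;\; \text{for all } y \in Y(x), $$

where, in contrast to ordinary semi-infinite programming, the index set $Y(x)$ may depend on the decision variable $x$.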

Optimality and complexity for constrained optimization problems with nonconvex regularization

In this paper, we consider a class of constrained optimization problems where the feasible set is a general closed convex set and the objective function has a nonsmooth, nonconvex regularizer. Such regularizers include the widely used SCAD, MCP, logistic, fraction, hard-thresholding and non-Lipschitz $L_p$ penalties as special cases. Using the theory of the generalized directional … Read more
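
A representative instance of this problem class is (illustrative notation):

$$ \min_{x \in \mathcal{X}} \; f(x) + \lambda \sum_{i=1}^{n} \varphi(|x_i|), $$

where $\mathcal{X}$ is a closed convex set, $f$ is a (typically smooth) loss term, and $\varphi$ is a nonsmooth, possibly nonconvex penalty; the choice $\varphi(t) = t^p$ with $0 < p < 1$ gives the non-Lipschitz $L_p$ penalty mentioned above.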

Nonsmooth Methods for Control Design with Integral Quadratic Constraints

We develop an optimization technique to compute local solutions to synthesis problems subject to integral quadratic constraints (IQCs). We use the fact that IQCs may be transformed into semi-infinite maximum eigenvalue constraints over the frequency axis and approach them via nonsmooth optimization methods. We develop a suitable spectral bundle method and prove its convergence in … Read more
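
The semi-infinite eigenvalue constraints referred to are of the schematic form (the symbol $F(\kappa, j\omega)$, gathering the IQC data and the controller parameters $\kappa$, is shorthand here rather than the authors' notation):

$$ \lambda_{\max}\big( F(\kappa, j\omega) \big) \le 0 \qquad \text{for all } \omega \in [0, \infty], $$

i.e., one maximum-eigenvalue inequality per frequency, which is what makes the constraint semi-infinite and motivates a spectral bundle method.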

Nonlinear Metric Subregularity

In this article, we investigate nonlinear metric subregularity properties of set-valued mappings between general metric or Banach spaces. We demonstrate that these properties can be treated in the framework of the theory of (linear) error bounds for extended real-valued functions of two variables developed in A. Y. Kruger, Error bounds and metric subregularity, Optimization 64, … Read more
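
For comparison, recall the (linear) metric subregularity property that these nonlinear notions generalize: a set-valued mapping $F$ between metric spaces is metrically subregular at $\bar{x}$ for $\bar{y} \in F(\bar{x})$ if there exist $\tau > 0$ and a neighbourhood $U$ of $\bar{x}$ such that

$$ d\big(x, F^{-1}(\bar{y})\big) \le \tau\, d\big(\bar{y}, F(x)\big) \qquad \text{for all } x \in U. $$

The nonlinear versions studied in the article replace the right-hand side by a nonlinear (for instance Hölder-type) function of $d(\bar{y}, F(x))$.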