Golden Ratio Algorithms for Variational Inequalities

The paper presents a fully explicit algorithm for monotone variational inequalities. The method uses variable stepsizes computed from the two most recent iterates, which serve as an approximation of the local Lipschitz constant, without running a linesearch. Thus, each iteration of the method requires only one evaluation of the monotone operator $F$ and of the proximal mapping of $g$. …
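
Read literally, the stepsize rule divides the change in the iterates by the change in the operator values to estimate local curvature. A minimal sketch under stated assumptions (the growth factor `rho`, the cap `lam_max`, and the toy problem are illustrative choices, not the paper's exact safeguards):

```python
import numpy as np

phi = (1 + np.sqrt(5)) / 2                    # golden ratio

def agraal(F, prox_g, x0, lam0=1.0, lam_max=1e6, n_iter=500):
    """Sketch of an adaptive golden-ratio iteration for the VI
    0 in F(x) + dg(x); no linesearch, one prox per step."""
    x_prev, x = x0, prox_g(x0 - lam0 * F(x0))
    x_bar, lam_prev = x.copy(), lam0
    rho = 1.0 / phi + 1.0 / phi ** 2          # assumed growth factor
    for _ in range(n_iter):
        num = np.linalg.norm(x - x_prev) ** 2
        den = np.linalg.norm(F(x) - F(x_prev)) ** 2
        # local (inverse) Lipschitz estimate from the two last iterates
        lam = min(rho * lam_prev,
                  phi * num / (4 * lam_prev * den) if den > 0 else lam_max,
                  lam_max)
        x_bar = ((phi - 1) * x + x_bar) / phi # golden-ratio averaging
        x_prev, x = x, prox_g(x_bar - lam * F(x))
        lam_prev = lam
    return x

# toy usage: rotation field (monotone, not strongly) over a box
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
sol = agraal(lambda z: A @ z, lambda z: np.clip(z, -1, 1), np.ones(2))
```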

Efficiency of minimizing compositions of convex functions and smooth maps

We consider the problem of minimizing the sum of a convex function and a composition of a convex function with a smooth map. Important examples include exact penalty formulations of nonlinear programs and nonlinear least-squares problems with side constraints. The basic algorithm we rely on is the well-known prox-linear method, which in each iteration …
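
For intuition, here is a minimal sketch of the prox-linear step in the special case $g \equiv 0$ and $h = \tfrac12\|\cdot\|^2$ (nonlinear least squares without side constraints), where the convexified subproblem has a closed form; in general each iteration solves a convex program instead:

```python
import numpy as np

def prox_linear_ls(c, J, x0, t=10.0, n_iter=50):
    """Prox-linear iteration for min 0.5*||c(x)||^2: each step minimizes
    0.5*||c(x) + J(x) d||^2 + ||d||^2 / (2 t), a damped linear
    least-squares problem in the step d."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        Jx, cx = J(x), c(x)
        d = np.linalg.solve(Jx.T @ Jx + np.eye(len(x)) / t, -Jx.T @ cx)
        x = x + d
    return x

# toy usage: Rosenbrock residuals c(x) = (10(x2 - x1^2), 1 - x1)
c = lambda x: np.array([10 * (x[1] - x[0] ** 2), 1 - x[0]])
J = lambda x: np.array([[-20 * x[0], 10.0], [-1.0, 0.0]])
x = prox_linear_ls(c, J, np.array([-1.2, 1.0]))   # approaches (1, 1)
```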

The Rate of Convergence of Augmented Lagrange Method for a Composite Optimization Problem

In this paper we analyze the local convergence rate of the augmented Lagrange method for solving optimization problems with equality constraints whose objective function is the sum of a convex function and a twice continuously differentiable function. The non-smoothness of the convex term in the objective requires extensive tools …
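
As a reminder of the method being analyzed, here is a minimal sketch of the multiplier iteration for min $f(x) + \varphi(x)$ s.t. $h(x) = 0$, with the inner subproblem solved inexactly by proximal gradient; the inner solver, stepsizes, and toy problem are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def soft(z, t):                                # prox of t * ||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def augmented_lagrangian(grad_f, prox_phi, h, Jh, x0, c=10.0,
                         outer=20, inner=200, step=1e-2):
    """min f(x) + phi(x) s.t. h(x) = 0: minimize the augmented
    Lagrangian in x (inexactly), then update the multipliers."""
    x = np.asarray(x0, float)
    lam = np.zeros(len(h(x)))
    for _ in range(outer):
        for _ in range(inner):                 # proximal gradient on L_c(., lam)
            g = grad_f(x) + Jh(x).T @ (lam + c * h(x))
            x = prox_phi(x - step * g, step)
        lam = lam + c * h(x)                   # first-order multiplier update
    return x, lam

# toy usage: min 0.5||x||^2 + ||x||_1  s.t.  x1 + x2 = 1
h = lambda x: np.array([x[0] + x[1] - 1.0])
Jh = lambda x: np.array([[1.0, 1.0]])
x, lam = augmented_lagrangian(lambda x: x, soft, h, Jh, np.zeros(2))
```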

An Inexact Spingarn’s Partial Inverse Method with Applications to Operator Splitting and Composite Optimization

We propose and study the iteration-complexity of an inexact version of Spingarn's partial inverse method. Its complexity analysis is performed by viewing it in the framework of the hybrid proximal extragradient (HPE) method, for which pointwise and ergodic iteration-complexity bounds were established recently by Monteiro and Svaiter. As applications, we propose and analyze the …
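
For context, a minimal sketch of the exact partial inverse iteration (the paper's contribution is an inexact version allowing relative errors in the resolvent evaluation, plus its complexity analysis); the consensus toy problem is an illustrative assumption:

```python
import numpy as np

def spingarn(resolvent_T, P_V, z0, n_iter=200):
    """Find x in V, y in V-perp with y in T(x): evaluate the resolvent
    of T, then split the result between V and its complement."""
    z = np.asarray(z0, float)
    for _ in range(n_iter):
        p = resolvent_T(z)                    # (I + T)^{-1} z
        q = z - p                             # so q is in T(p)
        z = P_V(p) + (q - P_V(q))             # recombine V and V-perp parts
    return P_V(p)

# toy usage: min 0.5||x - b||^2 over the consensus subspace V
b = np.array([1.0, 2.0, 6.0])
resolvent = lambda z: (z + b) / 2.0           # resolvent of T = grad f
P_V = lambda z: np.full_like(z, z.mean())     # projection onto V
x = spingarn(resolvent, P_V, np.zeros(3))     # -> consensus value 3.0
```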

A Multilevel Proximal Gradient Algorithm for a Class of Composite Optimization Problems

Composite optimization models involve minimizing the sum of a smooth (not necessarily convex) function and a non-smooth convex function. Such models arise in many applications where, in addition to the composite nature of the objective function, a hierarchy of models is readily available. It is common to take advantage of this hierarchy …
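
The fine-level building block here is the standard proximal gradient step; a minimal sketch (the paper's multilevel scheme replaces some of these steps with corrections computed from a coarser model, which is not shown):

```python
import numpy as np

def prox_grad(grad_f, prox_g, x0, step, n_iter=500):
    """Proximal gradient on F = f + g, the base (fine-level) iteration."""
    x = np.asarray(x0, float)
    for _ in range(n_iter):
        # a multilevel variant would occasionally substitute a
        # coarse-model correction for this fine-level step
        x = prox_g(x - step * grad_f(x), step)
    return x

# toy usage: LASSO  min 0.5||Ax - b||^2 + ||x||_1
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - t, 0.0)
x = prox_grad(lambda x: A.T @ (A @ x - b), soft,
              np.zeros(5), step=1.0 / np.linalg.norm(A, 2) ** 2)
```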

An Asynchronous Mini-Batch Algorithm for Regularized Stochastic Optimization

Mini-batch optimization has proven to be a powerful paradigm for large-scale learning. However, state-of-the-art parallel mini-batch algorithms assume synchronous operation or cyclic update orders. When worker nodes are heterogeneous (due to different computational capabilities or different communication delays), synchronous and cyclic operations are inefficient since they leave workers idle, waiting …
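
To make "no idle waiting" concrete, here is a minimal barrier-free sketch with threads: workers draw batches and apply proximal updates at their own pace, reading possibly stale iterates. This illustrates the operating mode only; the paper's algorithm and its delay-aware analysis are not reproduced here:

```python
import threading
import numpy as np

def async_minibatch(grad_batch, prox_g, x, step, n_workers=4, n_updates=250):
    """Workers update the shared iterate x whenever their batch is done;
    there is no barrier forcing fast workers to wait for slow ones."""
    lock = threading.Lock()

    def worker():
        for _ in range(n_updates):
            g = grad_batch(x.copy())          # possibly stale read of x
            with lock:                        # keep the write itself atomic
                x[:] = prox_g(x - step * g, step)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return x

# toy usage: l1-regularized least squares with random mini-batches
rng = np.random.default_rng(1)
A, b = rng.standard_normal((200, 10)), rng.standard_normal(200)

def grad_batch(x, m=20):
    i = rng.integers(0, 200, size=m)          # sample a mini-batch
    return A[i].T @ (A[i] @ x - b[i]) / m

soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - t, 0.0)
x = async_minibatch(grad_batch, soft, np.zeros(10), step=1e-3)
```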

Composite Self-concordant Minimization

We propose a variable metric framework for minimizing the sum of a self-concordant function and a possibly non-smooth convex function endowed with a computable proximal operator. We theoretically establish the convergence of our framework without relying on the usual Lipschitz gradient assumption on the smooth part. An important highlight of our work is a new …
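
The self-concordant machinery is easiest to see in the special case $g \equiv 0$, where a variable-metric step reduces to damped Newton with stepsize $1/(1+\lambda_k)$, $\lambda_k$ the Newton decrement, and no Lipschitz constant is needed; the paper's framework solves a scaled proximal subproblem instead. A minimal sketch of this special case:

```python
import numpy as np

def damped_newton(grad, hess, x0, n_iter=30):
    """Damped Newton with the self-concordant stepsize 1/(1 + lam)."""
    x = np.asarray(x0, float)
    for _ in range(n_iter):
        H = hess(x)
        d = np.linalg.solve(H, -grad(x))      # Newton direction
        lam = np.sqrt(d @ H @ d)              # Newton decrement
        x = x + d / (1.0 + lam)               # damping keeps x feasible
    return x

# toy usage: f(x) = <c, x> - sum(log x), self-concordant, minimum at 1/c
c = np.array([1.0, 2.0, 4.0])
x = damped_newton(lambda x: c - 1.0 / x,
                  lambda x: np.diag(1.0 / x ** 2),
                  np.ones(3))                 # -> [1.0, 0.5, 0.25]
```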

Randomized Block Coordinate Non-Monotone Gradient Method for a Class of Nonlinear Programming

In this paper we propose a randomized block coordinate non-monotone gradient (RBCNMG) method for minimizing the sum of a smooth (possibly nonconvex) function and a block-separable (possibly nonconvex, nonsmooth) function. At each iteration, this method randomly picks a block according to a prescribed probability distribution and solves several associated proximal subproblems that usually have …
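
A minimal sketch of one possible reading, with single-coordinate blocks, a uniform distribution, and a Grippo-style nonmonotone acceptance test over the last few objective values; the paper's actual subproblems and acceptance rule may differ:

```python
import numpy as np

def rbc_nonmonotone(F, grad_block, prox_block, x0, step0=1.0, mem=5,
                    sigma=1e-4, n_iter=300, seed=0):
    """Pick a random coordinate block, take a proximal step on it, and
    accept if the objective beats the max of the last `mem` values."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    hist = [F(x)]
    for _ in range(n_iter):
        i = rng.integers(len(x))              # uniform block choice
        step = step0
        while True:
            x_new = x.copy()
            x_new[i] = prox_block(i, x[i] - step * grad_block(i, x), step)
            ref = max(hist[-mem:])            # nonmonotone reference value
            if F(x_new) <= ref - sigma * (x_new[i] - x[i]) ** 2 or step < 1e-8:
                break
            step *= 0.5                       # backtrack on this block
        x = x_new
        hist.append(F(x))
    return x

# toy usage: l1-regularized least squares, coordinate blocks
rng = np.random.default_rng(2)
A, b = rng.standard_normal((30, 8)), rng.standard_normal(30)
F = lambda x: 0.5 * np.sum((A @ x - b) ** 2) + np.sum(np.abs(x))
gb = lambda i, x: A[:, i] @ (A @ x - b)
pb = lambda i, z, t: np.sign(z) * max(abs(z) - t, 0.0)
x = rbc_nonmonotone(F, gb, pb, np.zeros(8))
```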

On the Complexity Analysis of Randomized Block-Coordinate Descent Methods

In this paper we analyze randomized block-coordinate descent (RBCD) methods for minimizing the sum of a smooth convex function and a block-separable convex function. In particular, we extend Nesterov's technique (SIOPT 2012) for analyzing the RBCD method for minimizing a smooth convex function over a block-separable closed convex set to the aforementioned more general …

Iteration Complexity of Randomized Block-Coordinate Descent Methods for Minimizing a Composite Function

In this paper we develop a randomized block-coordinate descent method for minimizing the sum of a smooth and a simple nonsmooth block-separable convex function and prove that it obtains an $\epsilon$-accurate solution with probability at least $1-\rho$ in at most $O(\tfrac{n}{\epsilon} \log \tfrac{1}{\rho})$ iterations, where $n$ is the number of blocks. For strongly convex functions …
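
A minimal sketch of the iteration being analyzed, with single-coordinate blocks sampled uniformly and per-block Lipschitz stepsizes; the function names and the toy problem are illustrative assumptions:

```python
import numpy as np

def rbcd(grad_block, prox_block, L, x0, n_iter=2000, seed=0):
    """Randomized block-coordinate descent for f + sum_i g_i: sample a
    block uniformly, take a proximal step with block constant L[i]."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    for _ in range(n_iter):
        i = rng.integers(len(x))
        x[i] = prox_block(i, x[i] - grad_block(i, x) / L[i], 1.0 / L[i])
    return x

# toy usage: l1-regularized least squares, coordinate blocks
rng = np.random.default_rng(3)
A, b = rng.standard_normal((40, 10)), rng.standard_normal(40)
L = np.sum(A ** 2, axis=0)                    # per-coordinate Lipschitz constants
gb = lambda i, x: A[:, i] @ (A @ x - b)
pb = lambda i, z, t: np.sign(z) * max(abs(z) - t, 0.0)  # prox of t*|.|
x = rbcd(gb, pb, L, np.zeros(10))
```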