New Computational Guarantees for Solving Convex Optimization Problems with First Order Methods, via a Function Growth Condition Measure

Motivated by recent work of Renegar, we present new computational methods and associated computational guarantees for solving convex optimization problems using first-order methods. Our problem of interest is the general convex optimization problem $f^* = \min_{x \in Q} f(x)$, where we presume knowledge of a strict lower bound $f_{\mathrm{slb}} < f^*$. [Indeed, $f_{\mathrm{slb}}$ is naturally …
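The truncated abstract does not reveal the authors' specific methods, but a standard way a strict lower bound is exploited by first-order methods is through Polyak-type step sizes, with $f_{\mathrm{slb}}$ standing in for the unknown $f^*$. The minimal Python sketch below illustrates only that general idea; the oracles `f`, `subgrad`, and `proj_Q` are hypothetical placeholders, not taken from the paper.

import numpy as np

def polyak_subgradient(f, subgrad, proj_Q, x0, f_slb, iters=1000):
    """Projected subgradient method with a Polyak-type step size,
    using a strict lower bound f_slb < f* in place of the unknown f*."""
    x = x0.copy()
    best_x, best_val = x, f(x)
    for _ in range(iters):
        g = subgrad(x)
        gnorm2 = float(np.dot(g, g))
        if gnorm2 == 0.0:                 # zero subgradient: x is optimal
            break
        # Polyak step with f_slb in place of f*; a looser bound gives a longer step.
        step = (f(x) - f_slb) / gnorm2
        x = proj_Q(x - step * g)          # step, then project back onto Q
        if f(x) < best_val:
            best_x, best_val = x, f(x)
    return best_x, best_val

How much such a scheme loses relative to knowing $f^*$ exactly depends on the gap $f^* - f_{\mathrm{slb}}$, which is where a growth condition on $f$ naturally enters a complexity guarantee.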

First-Order Algorithms for Convex Optimization with Nonseparable Objective and Coupled Constraints

In this paper we consider a block-structured convex optimization model in which the block variables are nonseparable in the objective and are further linearly coupled through the constraints. For the 2-block case, we propose a number of first-order algorithms to solve this model. First, the alternating direction method of multipliers (ADMM) is extended, assuming that it …
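For orientation, the classical 2-block ADMM that the paper builds on targets $\min\, f(x) + g(z)$ subject to $Ax + Bz = c$, where the objective is separable across the two blocks; the model studied here drops that separability. A minimal Python sketch of the classical scheme, with the two subproblem solvers supplied as (hypothetical) oracles:

import numpy as np

def admm_2block(argmin_x, argmin_z, A, B, c, z0, y0, rho=1.0, iters=200):
    """Classical 2-block ADMM for min f(x) + g(z) s.t. Ax + Bz = c.
    argmin_x(z, y) and argmin_z(x, y) minimize the augmented Lagrangian
      L_rho(x,z,y) = f(x) + g(z) + y.(Ax+Bz-c) + (rho/2)||Ax+Bz-c||^2
    over x and over z, respectively."""
    z, y = z0.copy(), y0.copy()
    for _ in range(iters):
        x = argmin_x(z, y)                  # x-block update
        z = argmin_z(x, y)                  # z-block update
        y = y + rho * (A @ x + B @ z - c)   # multiplier (dual) update
    return x, z, y

When $f$ and $g$ are replaced by a single nonseparable $f(x, z)$, the two alternating minimizations above no longer decouple cleanly, which is the difficulty the extensions in the paper address.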

On the convergence of the Sakawa-Shindo algorithm in stochastic control

We analyze an algorithm for solving stochastic control problems, based on Pontryagin’s maximum principle, due to Sakawa and Shindo in the deterministic case and extended to the stochastic setting by Mazliak. We assume that either the volatility is an affine function of the state, or the dynamics are linear. We obtain a monotone decrease of …

Performance of First- and Second-Order Methods for L1-Regularized Least Squares Problems

We study the performance of first- and second-order optimization methods for $\ell_1$-regularized sparse least-squares problems as the conditioning and the dimensions of the problem increase up to one trillion. A rigorously defined generator is presented which allows control of the dimensions, the conditioning and the sparsity of the problem. The generator has very low memory …
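The paper's rigorously defined generator is not shown in the snippet; as a rough stand-in, the toy Python generator below illustrates the kind of control described, over dimensions, sparsity, and (crudely) conditioning, for instances of $\min_x \frac{1}{2}\|Ax-b\|_2^2 + \lambda\|x\|_1$. All names and choices here are illustrative only.

import numpy as np

def toy_lasso_instance(m, n, nnz, cond, seed=0):
    """Build A (m x n), b, and a sparse ground truth x_true for an
    l1-regularized least-squares instance.  'cond' controls the spread
    of A's singular values; 'nnz' is the support size of x_true."""
    rng = np.random.default_rng(seed)
    k = min(m, n)
    s = np.geomspace(1.0, 1.0 / cond, k)            # geometric singular values
    U, _ = np.linalg.qr(rng.standard_normal((m, k)))
    V, _ = np.linalg.qr(rng.standard_normal((n, k)))
    A = U @ np.diag(s) @ V.T
    x_true = np.zeros(n)
    support = rng.choice(n, size=nnz, replace=False)
    x_true[support] = rng.standard_normal(nnz)
    b = A @ x_true + 1e-3 * rng.standard_normal(m)  # light noise
    return A, b, x_true

Unlike the paper's generator, this dense construction stores $A$ explicitly and therefore cannot approach trillion-scale dimensions; it only shows the interface such a generator exposes.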

Object-Parallel Infrastructure for Implementing First-Order Methods, with an Example Application to LASSO

We describe the design of a C++ vector-manipulation substrate that allows first-order optimization algorithms to be expressed in a concise and readable manner, yet still achieve high performance in parallel computing environments. We use standard object-oriented techniques of encapsulation and operator overloading, combined with a novel “symbolic temporaries” delayed-evaluation system that greatly reduces the overhead …
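The “symbolic temporaries” idea, recording vector expressions at operator-call time and evaluating them later into a single destination buffer rather than materializing one temporary per arithmetic operation, can be sketched briefly. The paper's substrate is C++ (where this is typically done with expression templates); the Python below only illustrates the pattern, not the paper's interface.

import numpy as np

class Expr:
    """A symbolic linear combination of vectors.  Operators build up a
    term list; no arithmetic happens until evaluate() is called."""
    def __init__(self, terms):
        self.terms = terms                 # list of (coefficient, ndarray)
    def __add__(self, other):
        return Expr(self.terms + other.terms)
    def __rmul__(self, c):
        return Expr([(c * a, v) for a, v in self.terms])
    def evaluate(self, out):
        """Accumulate all terms into one preallocated buffer (a real C++
        substrate would fuse this loop without per-term temporaries)."""
        out[:] = 0.0
        for a, v in self.terms:
            out += a * v
        return out

class Vec(Expr):
    def __init__(self, data):
        super().__init__([(1.0, np.asarray(data, dtype=float))])

x, y = Vec([1.0, 2.0]), Vec([3.0, 4.0])
expr = x + 2.0 * y                  # builds a symbolic expression only
out = expr.evaluate(np.empty(2))    # now computes: out == [7.0, 10.0]

The payoff is that assignment drives a single accumulation into the destination buffer instead of allocating a chain of intermediate vectors, which is the overhead the abstract refers to.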

Fast Bundle-Level Type Methods for unconstrained and ball-constrained convex optimization

It has been shown in \cite{Lan13-1} that the accelerated prox-level (APL) method and its variant, the uniform smoothing level (USL) method, have optimal iteration complexity for solving black-box and structured convex programming problems without requiring the input of any smoothness information. However, these algorithms require the assumption that the feasible set is bounded and …
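For context, and stated here as background rather than as the paper's contribution: methods of this bundle-level family iterate by projecting onto a level set of a cutting-plane model $m_k$ of $f$. Given bounds $f^{\mathrm{lb}}_k \le f^* \le f^{\mathrm{ub}}_k$ and a parameter $\lambda \in (0,1)$,
\[
\ell_k \;=\; f^{\mathrm{lb}}_k + \lambda\left(f^{\mathrm{ub}}_k - f^{\mathrm{lb}}_k\right),
\qquad
x_{k+1} \;=\; \operatorname*{argmin}_{x \in Q}\;\|x - x_k\|^2
\quad\text{s.t.}\quad m_k(x) \le \ell_k .
\]
The classical analysis of such steps leans on the diameter of the feasible set $Q$, which is why removing the boundedness assumption, as the truncated sentence begins to describe, is nontrivial.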

On the Information-Adaptive Variants of the ADMM: an Iteration Complexity Perspective

Designing algorithms for an optimization model often amounts to maintaining a balance between the degree of information requested from the model on the one hand, and the computational speed one can expect on the other. Naturally, the more information is available, the faster one can expect the algorithm to converge. The popular algorithm of …

Efficient First-Order Methods for Linear Programming and Semidefinite Programming

We present a simple transformation of any linear program or semidefinite program into an equivalent convex optimization problem whose only constraints are linear equations. The objective function is defined on the whole space, making virtually all subgradient methods immediately applicable. We observe, moreover, that the objective function is naturally “smoothed,” thereby allowing most first-order …
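To make the abstract's point concrete: once the only constraints are linear equations $Ax = b$, the Euclidean projection onto the feasible set is affine, $P(x) = x - A^{+}(Ax - b)$, so any projected subgradient scheme applies off the shelf. A generic Python sketch, where `F` and `subgrad_F` stand in for the transformed objective and a subgradient oracle (the transformation itself is not reproduced here):

import numpy as np

def projected_subgradient(F, subgrad_F, A, b, x0, steps=1000, stepsize=1e-2):
    """Projected subgradient method for min F(x) s.t. Ax = b.
    The affine projection uses the pseudoinverse of A, computed once."""
    A_pinv = np.linalg.pinv(A)
    def proj(x):
        return x - A_pinv @ (A @ x - b)   # P(x) = x - A^+(Ax - b)
    x = proj(x0)
    best_x, best_val = x, F(x)
    for _ in range(steps):
        x = proj(x - stepsize * subgrad_F(x))
        if F(x) < best_val:
            best_x, best_val = x, F(x)
    return best_x, best_val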

On the ergodic convergence rates of a first-order primal-dual algorithm

We revisit the proofs of convergence for a first-order primal-dual algorithm for convex optimization which we studied a few years ago. In particular, we prove rates of convergence for a more general version, with simpler proofs and more complete results.
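For reference, the algorithm in question, in its commonly cited form for the saddle-point problem $\min_x \max_y \,\langle Kx, y\rangle + g(x) - f^*(y)$, iterates
\[
\begin{aligned}
y_{n+1} &= \operatorname{prox}_{\sigma f^*}\!\left(y_n + \sigma K \bar{x}_n\right),\\
x_{n+1} &= \operatorname{prox}_{\tau g}\!\left(x_n - \tau K^{*} y_{n+1}\right),\\
\bar{x}_{n+1} &= x_{n+1} + \theta\,(x_{n+1} - x_n),
\end{aligned}
\qquad \tau\sigma\|K\|^2 \le 1 ,
\]
and the ergodic rates concern averages of these iterates. The more general version analyzed in the paper is not reproduced here.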

Intermediate gradient methods for smooth convex problems with inexact oracle

Between the robust but slow (primal or dual) gradient methods and the fast gradient methods, which are fast but sensitive to errors, our goal in this paper is to develop first-order methods for smooth convex problems with intermediate speed and intermediate sensitivity to errors. We develop a general family of first-order methods, the Intermediate Gradient Method (IGM), …
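The inexact-oracle model underlying this line of work, stated here for completeness since the snippet breaks off, replaces exact first-order information at $x$ with a pair $(f_\delta(x), g_\delta(x))$ satisfying, for all $y$,
\[
f_\delta(x) + \langle g_\delta(x),\, y - x\rangle
\;\le\; f(y) \;\le\;
f_\delta(x) + \langle g_\delta(x),\, y - x\rangle + \frac{L}{2}\,\|y - x\|^2 + \delta .
\]
Under this model the classical gradient method suffers an accuracy loss of order $\delta$, while the fast gradient method's loss grows with the iteration count; the Intermediate Gradient Method is designed to trade between these two regimes.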