The Direct Extension of ADMM for Multi-block Convex Minimization Problems is Not Necessarily Convergent

The alternating direction method of multipliers (ADMM) is now widely used in many fields, and its convergence was proved when two blocks of variables are alternately updated. It is strongly desirable and practically valuable to extend ADMM directly to the case of a multi-block convex minimization problem whose objective function is the sum of …
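
For concreteness, here is a sketch of the scheme in question for three blocks, with notation assumed for this illustration: a separable objective $\theta_1(x_1)+\theta_2(x_2)+\theta_3(x_3)$, a coupling constraint $A_1x_1+A_2x_2+A_3x_3=b$, a penalty parameter $\beta>0$, and the associated augmented Lagrangian $\mathcal{L}_\beta$. The direct extension updates the blocks one at a time in Gauss–Seidel fashion and then the multiplier:

$$
\begin{aligned}
x_1^{k+1} &\in \arg\min_{x_1}\; \mathcal{L}_\beta(x_1, x_2^{k}, x_3^{k}, \lambda^{k}),\\
x_2^{k+1} &\in \arg\min_{x_2}\; \mathcal{L}_\beta(x_1^{k+1}, x_2, x_3^{k}, \lambda^{k}),\\
x_3^{k+1} &\in \arg\min_{x_3}\; \mathcal{L}_\beta(x_1^{k+1}, x_2^{k+1}, x_3, \lambda^{k}),\\
\lambda^{k+1} &= \lambda^{k} - \beta\,\big(A_1 x_1^{k+1} + A_2 x_2^{k+1} + A_3 x_3^{k+1} - b\big).
\end{aligned}
$$

With two blocks this recursion is the classical, provably convergent ADMM; the point of the paper is that the same recursion with three or more blocks need not converge.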

Quantitative Characterizations of Regularity Properties of Collections of Sets

Several primal and dual characterizations of regularity properties of collections of sets in normed linear spaces are discussed. Relationships between regularity properties of collections of sets and those of set-valued mappings are provided. Citation: Journal of Optimization Theory and Applications (2015) 164:41–67.

Smooth minimization of nonsmooth functions with parallel coordinate descent methods

We study the performance of a family of randomized parallel coordinate descent methods for minimizing the sum of nonsmooth and separable convex functions. The problem class includes as a special case L1-regularized L1 regression and the minimization of the exponential loss (“AdaBoost problem”). We assume the input data defining the loss function is contained …
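
As a rough illustration of the coordinate-descent machinery (not the paper's parallel method itself), the sketch below applies randomized serial coordinate descent with a soft-thresholding proximal step to L1-regularized least squares; the problem choice, function names, and step sizes are assumptions made for this example only.

    import numpy as np

    def soft_threshold(z, t):
        # proximal operator of t * |.|
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def rand_coordinate_descent(A, b, lam, iters=2000, seed=0):
        # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by updating one random
        # coordinate at a time (serial sketch; the paper studies parallel updates).
        # Assumes every column of A is nonzero.
        rng = np.random.default_rng(seed)
        n = A.shape[1]
        x = np.zeros(n)
        r = A @ x - b                      # residual, kept up to date
        L = (A ** 2).sum(axis=0)           # coordinate-wise Lipschitz constants
        for _ in range(iters):
            j = rng.integers(n)
            g = A[:, j] @ r                # partial gradient at coordinate j
            x_new = soft_threshold(x[j] - g / L[j], lam / L[j])
            r += A[:, j] * (x_new - x[j])
            x[j] = x_new
        return x

A parallel variant would update a randomly chosen subset of coordinates simultaneously with suitably damped step sizes, which is where the smoothing and step-size analysis of the paper enters.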

Primal-dual methods for solving infinite-dimensional games

In this paper we show that infinite-dimensional differential games with a simple objective functional can be solved in a finite-dimensional dual form in the space of dual multipliers for the constraints related to the end points of the trajectories. The primal solutions can be easily reconstructed by appropriate dual subgradient schemes. The suggested schemes …
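
As a generic illustration of the kind of dual scheme referred to (the paper's specific games and end-point constraints are not reproduced here), consider $\min_x\{f(x): g(x)=0\}$ with dual function $\varphi(\lambda)=\min_x\{f(x)+\langle\lambda, g(x)\rangle\}$ attained at $x(\lambda)$. A dual subgradient method with primal averaging takes

$$
\lambda_{k+1} = \lambda_k + h_k\, g\big(x(\lambda_k)\big), \qquad
\bar{x}_k = \frac{\sum_{i=0}^{k} h_i\, x(\lambda_i)}{\sum_{i=0}^{k} h_i},
$$

so that approximate primal solutions $\bar{x}_k$ are reconstructed from the same information used to update the multipliers.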

Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization

We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. …
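
To make the dual-coordinate idea concrete, here is a plain (non-proximal, non-accelerated) stochastic dual coordinate ascent sketch for ridge regression; the objective, variable names, and the closed-form coordinate update for the squared loss are standard, but everything beyond what the abstract states is an illustration rather than the paper's algorithm.

    import numpy as np

    def sdca_ridge(X, y, lam, epochs=10, seed=0):
        # Plain SDCA for min_w (1/n) * sum_i 0.5*(x_i^T w - y_i)^2 + (lam/2)*||w||^2.
        # Maintains dual variables alpha and the primal iterate w = X^T alpha / (lam*n).
        # Simplified sketch; not the paper's accelerated proximal variant.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        alpha = np.zeros(n)
        w = np.zeros(d)
        sq_norms = (X ** 2).sum(axis=1)
        for _ in range(epochs):
            for i in rng.permutation(n):
                # closed-form maximization of the dual in coordinate i (squared loss)
                delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + sq_norms[i] / (lam * n))
                alpha[i] += delta
                w += (delta / (lam * n)) * X[i]
        return w

The proximal and accelerated versions in the paper wrap an inner-outer iteration around updates of this type so that nonsmooth regularizers (for example the Lasso penalty) can be handled with improved rates.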

Large-scale optimization with the primal-dual column generation method

The primal-dual column generation method (PDCGM) is a general-purpose column generation technique that relies on the primal-dual interior point method to solve the restricted master problems. The use of this interior point method variant makes it possible to obtain suboptimal and well-centered dual solutions, which naturally stabilizes the column generation. A reduction in the number of calls …
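
For readers unfamiliar with the mechanics, a generic LP column generation step (notation assumed for this sketch) looks as follows: the restricted master problem is solved over a subset $J$ of columns, its dual prices $y$ are passed to a pricing problem, and columns with negative reduced cost are added,

$$
\min_{\lambda \ge 0} \Big\{ \sum_{j\in J} c_j \lambda_j \;:\; \sum_{j\in J} a_j \lambda_j = b \Big\},
\qquad
z_{\mathrm{price}} = \min_{j} \; c_j - y^{\top} a_j ,
$$

with the process stopping once $z_{\mathrm{price}} \ge 0$ (or exceeds a small negative tolerance). In the PDCGM the prices $y$ are suboptimal, well-centered interior-point duals of the restricted master problem rather than extreme-point optima, which is the source of the stabilization mentioned above.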

An Inexact Successive Quadratic Approximation Method for Convex L-1 Regularized Optimization

We study a Newton-like method for the minimization of an objective function $\phi$ that is the sum of a smooth convex function and an $\ell_1$ regularization term. This method, which is sometimes referred to in the literature as a proximal Newton method, computes a step by minimizing a piecewise quadratic model $q_k$ of the objective …
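
In the notation of the abstract, with $\phi(x) = f(x) + \mu\|x\|_1$ for a smooth convex $f$ and a weight $\mu>0$ (the symbols $f$, $\mu$, and the Hessian approximation $B_k$ are assumed here for illustration), a common way to write the piecewise quadratic model is

$$
q_k(d) \;=\; f(x_k) + \nabla f(x_k)^{\top} d + \tfrac{1}{2}\, d^{\top} B_k\, d + \mu \,\| x_k + d \|_1 ,
$$

with the step obtained by minimizing $q_k$ inexactly over $d$ and setting $x_{k+1} = x_k + \alpha_k d_k$ for a suitable step length $\alpha_k$.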

Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization

In this paper, we present a new stochastic algorithm, namely the stochastic block mirror descent (SBMD) method, for solving large-scale nonsmooth and stochastic optimization problems. The basic idea of this algorithm is to incorporate block-coordinate decomposition and an incremental block averaging scheme into the classic (stochastic) mirror-descent method, in order to significantly reduce the …
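
Schematically (with notation assumed for this sketch), the decision vector is split into blocks $x = (x^{(1)},\dots,x^{(B)})$, and at iteration $k$ a single randomly chosen block $i_k$ is updated by a mirror-descent prox step based on a stochastic subgradient block $G_{i_k}$, the other blocks being left unchanged:

$$
x^{(i_k)}_{k+1} \;=\; \arg\min_{u \in X_{i_k}} \Big\{ \big\langle G_{i_k}(x_k,\xi_k),\, u \big\rangle + \frac{1}{\gamma_k}\, V_{i_k}\big(x^{(i_k)}_k, u\big) \Big\},
$$

where $V_{i_k}$ is the Bregman divergence associated with the block's prox-function and $\gamma_k$ is a stepsize.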

Quadratic growth and critical point stability of semi-algebraic functions

We show that quadratic growth of a semi-algebraic function is equivalent to strong metric subregularity of the subdifferential, a kind of stability of generalized critical points. In contrast, this equivalence can easily fail outside of the semi-algebraic setting. Citation: 13 pages, September 2013.
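
For orientation, the two properties being related are usually stated as follows for a local minimizer $\bar{x}$ of $f$, with constants $c,\kappa>0$ and neighborhoods left implicit (this phrasing is standard in the literature and is supplied here rather than taken from the abstract):

$$
f(x) \;\ge\; f(\bar{x}) + \frac{c}{2}\,\|x - \bar{x}\|^2
\quad \text{(quadratic growth)},
\qquad
\|x - \bar{x}\| \;\le\; \kappa\, \operatorname{dist}\!\big(0,\, \partial f(x)\big)
\quad \text{(strong metric subregularity of } \partial f \text{ at } \bar{x} \text{ for } 0\text{)},
$$

both required for all $x$ near $\bar{x}$.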

On the Coupled Continuous Knapsack Problems: Projection Onto the Volume Constrained Gibbs N-Simplex

Coupled continuous quadratic knapsack problems (CCK) are introduced in the present study. The solution of a CCK problem is equivalent to the projection of an arbitrary point onto the volume-constrained Gibbs N-simplex, which has a wide range of applications in computational science and engineering. Three algorithms are developed to …
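
As a simplified point of comparison (not the coupled, volume-constrained problem of the paper), Euclidean projection onto the standard simplex $\{x \ge 0,\ \sum_i x_i = s\}$ admits the classical sorting-based solution sketched below; the coupling and the upper bounds present in the Gibbs-simplex setting are omitted here.

    import numpy as np

    def project_onto_simplex(v, s=1.0):
        # Euclidean projection of v onto {x >= 0, sum(x) = s}, s > 0,
        # via the classical sort-and-threshold rule.
        # (The paper's Gibbs-simplex problem additionally has upper bounds and coupling.)
        u = np.sort(v)[::-1]                  # sort in decreasing order
        css = np.cumsum(u) - s
        ind = np.arange(1, v.size + 1)
        cond = u - css / ind > 0
        rho = ind[cond][-1]                   # number of strictly positive entries
        theta = css[cond][-1] / rho           # optimal threshold
        return np.maximum(v - theta, 0.0)

    # example: project a random point onto the unit simplex
    x = project_onto_simplex(np.random.randn(5))
    assert abs(x.sum() - 1.0) < 1e-10 and (x >= 0).all()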