A Bundle Method for Exploiting Additive Structure in Difficult Optimization Problems

This paper describes a bundle method for (approximately) minimizing complicated nonsmooth convex functions with additive structure, with the primary goal of computing bounds on the solution values of difficult optimization problems such as stochastic integer programs. The method combines features that have appeared in previously proposed bundle methods, but not in the particular configuration we …
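
To make the setting concrete, here is a generic disaggregated proximal bundle step for an additive objective $f(x)=\sum_{i=1}^m f_i(x)$ (our notation, not necessarily the paper's configuration): each component $f_i$ keeps its own cutting-plane model $\hat f_i^k$, and the next candidate solves
$$
x^{k+1} \in \arg\min_x \;\sum_{i=1}^m \hat f_i^k(x) + \frac{1}{2t_k}\|x-\hat x^k\|^2,
\qquad
\hat f_i^k(x)=\max_{j\in J_i^k}\big\{f_i(x^j)+\langle g_i^j,\,x-x^j\rangle\big\},
$$
with stability center $\hat x^k$, stepsize $t_k>0$, and subgradients $g_i^j\in\partial f_i(x^j)$.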

An Asynchronous Mini-Batch Algorithm for Regularized Stochastic Optimization

Mini-batch optimization has proven to be a powerful paradigm for large-scale learning. However, state-of-the-art parallel mini-batch algorithms assume synchronous operation or cyclic update orders. When worker nodes are heterogeneous (due to different computational capabilities or different communication delays), synchronous and cyclic operations are inefficient since they leave workers idle waiting …
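
As a rough illustration of the setting (a serial toy simulation under our own assumptions, not the paper's algorithm), the sketch below applies mini-batch proximal-gradient steps in which each gradient may be evaluated at a stale iterate, mimicking heterogeneous workers; `grad_fn` and the $\ell_1$ regularizer are hypothetical stand-ins.

```python
import numpy as np

def prox_l1(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def async_minibatch(grad_fn, x0, steps=1000, lr=0.1, lam=1e-3, max_delay=5, seed=0):
    # Serial simulation of asynchrony: each mini-batch gradient is evaluated
    # at a randomly delayed (stale) iterate, then a prox step handles the
    # regularizer. Hypothetical sketch, not the paper's method.
    rng = np.random.default_rng(seed)
    history = [np.array(x0, dtype=float)]
    x = history[0].copy()
    for _ in range(steps):
        delay = int(rng.integers(0, min(max_delay, len(history))))
        stale_x = history[-1 - delay]      # possibly stale iterate
        g = grad_fn(stale_x, rng)          # stochastic mini-batch gradient
        x = prox_l1(x - lr * g, lr * lam)  # forward-backward step
        history.append(x.copy())
    return x
```

Here `grad_fn(x, rng)` would return a mini-batch stochastic gradient at `x`; in a real asynchronous system the staleness comes from worker scheduling rather than random simulation.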

Probabilistic optimization via approximate p-efficient points and bundle methods

For problems where decisions are taken prior to observing the realization of underlying random events, probabilistic constraints are an important modelling tool if reliability is a concern. A key concept for numerically dealing with probabilistic constraints is that of p-efficient points. By adopting a dual point of view, we develop a solution framework that includes …
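
For reference, a point $v$ is a p-efficient point of the distribution function $F$ of the random right-hand side, for a level $p\in(0,1]$, if
$$
F(v)\ \ge\ p
\quad\text{and there is no}\quad
z \le v,\ z \ne v,\ \text{with}\ F(z)\ \ge\ p,
$$
so a chance constraint of the form $\mathbb{P}(\xi \le Tx)\ \ge\ p$ holds exactly when $Tx \ge v$ for some p-efficient $v$. This is the standard definition; the paper works with approximate p-efficient points within a dual framework.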

On the Step Size of Symmetric Alternating Direction Method of Multipliers

The alternating direction method of multipliers (ADMM) is an application of the Douglas-Rachford splitting method, and the symmetric version of ADMM, which updates the Lagrange multiplier twice at each iteration, is an application of the Peaceman-Rachford splitting method. The symmetric ADMM often works well empirically, but its convergence is not guaranteed in theory. It was recently found …
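
In our notation, for $\min_{x,y}\, f(x)+g(y)$ s.t. $Ax+By=b$ with augmented Lagrangian $\mathcal{L}_\beta$, the symmetric ADMM with dual step sizes $r,s>0$ reads
$$
\begin{aligned}
x^{k+1} &= \arg\min_x \mathcal{L}_\beta(x, y^k, \lambda^k),\\
\lambda^{k+1/2} &= \lambda^k - r\beta\,(Ax^{k+1}+By^k-b),\\
y^{k+1} &= \arg\min_y \mathcal{L}_\beta(x^{k+1}, y, \lambda^{k+1/2}),\\
\lambda^{k+1} &= \lambda^{k+1/2} - s\beta\,(Ax^{k+1}+By^{k+1}-b),
\end{aligned}
$$
where the case $r=s=1$ corresponds to the Peaceman-Rachford application mentioned above, whose convergence can fail; the step sizes $r,s$ are what the title refers to.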

$\varepsilon$-Subdifferential of Set-valued Map and Its Application

In this paper, firstly, the concept of the $\varepsilon$-strictly efficient subdifferential for set-valued maps is introduced in Hausdorff locally convex topological vector spaces. Secondly, a characterization of this subdifferential by scalarization and a generalized $\varepsilon$-Moreau-Rockafellar type theorem for set-valued maps are established. Finally, the necessary optimality condition of the constrained set-valued optimization problem for $\varepsilon$- …
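
For orientation, the scalar prototype being generalized: for a convex function $f$ and $\varepsilon \ge 0$, the classical $\varepsilon$-subdifferential at $\bar x$ is
$$
\partial_\varepsilon f(\bar x)=\{\,x^* : f(x) \ge f(\bar x)+\langle x^*, x-\bar x\rangle-\varepsilon \ \ \text{for all } x\,\},
$$
and the exact Moreau-Rockafellar theorem gives $\partial (f_1+f_2)(\bar x)=\partial f_1(\bar x)+\partial f_2(\bar x)$ under a qualification condition; the paper extends such statements to set-valued maps via $\varepsilon$-strict efficiency.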

Global Convergence of Unmodified 3-Block ADMM for a Class of Convex Minimization Problems

The alternating direction method of multipliers (ADMM) has been successfully applied to solve structured convex optimization problems due to its superior practical performance. The convergence properties of the 2-block ADMM have been studied extensively in the literature. Specifically, it has been proven that the 2-block ADMM globally converges for any penalty parameter $\gamma>0$. In this …
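
For concreteness, the unmodified (Gauss-Seidel) 3-block ADMM for $\min\, f_1(x_1)+f_2(x_2)+f_3(x_3)$ s.t. $A_1x_1+A_2x_2+A_3x_3=b$ iterates
$$
\begin{aligned}
x_1^{k+1} &= \arg\min_{x_1} \mathcal{L}_\gamma(x_1, x_2^k, x_3^k, \lambda^k),\\
x_2^{k+1} &= \arg\min_{x_2} \mathcal{L}_\gamma(x_1^{k+1}, x_2, x_3^k, \lambda^k),\\
x_3^{k+1} &= \arg\min_{x_3} \mathcal{L}_\gamma(x_1^{k+1}, x_2^{k+1}, x_3, \lambda^k),\\
\lambda^{k+1} &= \lambda^k - \gamma\,(A_1x_1^{k+1}+A_2x_2^{k+1}+A_3x_3^{k+1}-b),
\end{aligned}
$$
where $\mathcal{L}_\gamma$ is the augmented Lagrangian with penalty $\gamma$. This direct extension is known to diverge on some examples, which is why identifying convergent problem classes matters.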

An Inexact Proximal Algorithm for Pseudomonotone and Quasimonotone Variational Inequalities

In this paper we introduce an inexact proximal point algorithm using proximal distances for solving variational inequality problems when the mapping is pseudomonotone or quasimonotone. Under some natural assumptions we prove that the sequence generated by the algorithm is convergent for the pseudomonotone case and weakly convergent for the quasimonotone case. This approach unifies the …
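
A generic template for such a scheme (our notation; the paper's exact inexactness criterion may differ): given a proximal distance $d$ and parameters $\lambda_k>0$, the iterate $x^{k+1}$ approximately solves
$$
0 \in T(x^{k+1}) + \lambda_k \nabla_1 d(x^{k+1}, x^k) + e^{k+1},
\qquad \|e^{k+1}\| \le \eta_k,
$$
where $T$ is the (pseudo- or quasimonotone) operator of the variational inequality, $\nabla_1 d$ is the gradient of $d$ in its first argument, and $\{\eta_k\}$ controls the allowed error.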

A New Perspective on Boosting in Linear Regression via Subgradient Optimization and Relatives

In this paper we analyze boosting algorithms in linear regression from a new perspective: that of modern first-order methods in convex optimization. We show that classic boosting algorithms in linear regression, namely the incremental forward stagewise algorithm (FS-epsilon) and least squares boosting (LS-Boost-epsilon), can be viewed as subgradient descent to minimize the loss function defined …
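
A minimal sketch of the incremental forward stagewise algorithm FS-epsilon (the standard description; variable names are ours): each step nudges the coefficient most correlated with the current residual by a fixed amount epsilon, which is the update the paper reinterprets as a subgradient step.

```python
import numpy as np

def forward_stagewise(X, y, eps=0.01, steps=1000):
    # FS-epsilon: repeatedly move the coefficient whose column is most
    # correlated with the residual by +/- eps. Assumes columns of X are
    # standardized; a minimal sketch, not the paper's analysis.
    n, p = X.shape
    beta = np.zeros(p)
    r = y.astype(float)                # residual y - X @ beta
    for _ in range(steps):
        c = X.T @ r                    # correlations with residual
        j = int(np.argmax(np.abs(c)))  # most correlated coordinate
        step = eps * np.sign(c[j])
        beta[j] += step
        r -= step * X[:, j]            # keep residual consistent with beta
    return beta
```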

First-Order Algorithms for Convex Optimization with Nonseparable Objective and Coupled Constraints

In this paper we consider a block-structured convex optimization model, where the block variables are nonseparable in the objective and are further linearly coupled in the constraints. For the 2-block case, we propose a number of first-order algorithms to solve this model. First, the alternating direction method of multipliers (ADMM) is extended, assuming that it …
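
One natural variant in this setting (a sketch in our notation; the paper studies several algorithms under additional assumptions): for $\min_{x,y}\, f(x,y)$ s.t. $Ax+By=b$, alternate the blocks on the augmented Lagrangian $\mathcal{L}_\gamma(x,y,\lambda)=f(x,y)-\langle\lambda,\,Ax+By-b\rangle+\frac{\gamma}{2}\|Ax+By-b\|^2$:
$$
x^{k+1}=\arg\min_x \mathcal{L}_\gamma(x,y^k,\lambda^k),\quad
y^{k+1}=\arg\min_y \mathcal{L}_\gamma(x^{k+1},y,\lambda^k),\quad
\lambda^{k+1}=\lambda^k-\gamma\,(Ax^{k+1}+By^{k+1}-b).
$$
Unlike the separable case, each block subproblem now involves the joint function $f$, which is what makes the analysis nontrivial.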

Solving nonsmooth convex optimization with complexity $O(\eps^{-1/2})$

This paper describes an algorithm for solving structured nonsmooth convex optimization problems using OSGA, a first-order method with complexity $O(\eps^{-2})$ for Lipschitz continuous nonsmooth problems and $O(\eps^{-1/2})$ for smooth problems with Lipschitz continuous gradient. If the nonsmoothness of the problem is manifested in a structured way, we reformulate the problem in a form that …