A Practical Relative Error Criterion for Augmented Lagrangians

This paper develops a new error criterion for the approximate minimization of augmented Lagrangian subproblems. This criterion is practical in the sense that it requires only information that is ordinarily readily available, such as the gradient (or a subgradient) of the augmented Lagrangian. It is also “relative” in the sense of relative error criteria for …
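The idea of a gradient-based relative test can be sketched generically: inside a method-of-multipliers loop, the subproblem is accepted once the augmented Lagrangian's gradient norm is small relative to the current constraint violation. This is an illustrative scheme only, not the paper's exact criterion; all function names and constants here are assumptions.

```python
import numpy as np

# Generic augmented Lagrangian loop for  min f(x) s.t. Ax = b  with a
# *relative* inner stopping test: stop the inner solve when the gradient
# of the augmented Lagrangian is small relative to the residual ||Ax-b||.
# (Hedged sketch; the paper's criterion differs in its details.)
def aug_lagrangian(f_grad, A, b, x0, c=10.0, sigma=0.5, outer=50):
    x, lam = x0.copy(), np.zeros(len(b))
    for _ in range(outer):
        # Inner loop: approximately minimize L(., lam) by gradient descent.
        for _ in range(1000):
            r = A @ x - b
            g = f_grad(x) + A.T @ (lam + c * r)   # gradient of aug. Lagrangian
            if np.linalg.norm(g) <= sigma * c * max(np.linalg.norm(r), 1e-12):
                break                             # relative error test
            x -= 0.01 * g
        lam += c * (A @ x - b)                    # multiplier update
    return x, lam
```

Because the tolerance shrinks with the residual, the inner solves become more accurate exactly as the outer iterates approach feasibility, without requiring a preset absolute tolerance sequence.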

Complexity of variants of Tseng’s modified F-B splitting and Korpelevich’s methods for generalized variational inequalities with applications to saddle point and convex optimization problems

In this paper, we consider both a variant of Tseng’s modified forward-backward splitting method and an extension of Korpelevich’s method for solving generalized variational inequalities with Lipschitz continuous operators. By showing that these methods are special cases of the hybrid proximal extragradient (HPE) method introduced by Solodov and Svaiter, we derive iteration-complexity bounds for them …
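For readers unfamiliar with the base scheme, Tseng's modified forward-backward (forward-backward-forward) iteration for the variational inequality 0 ∈ F(x) + N_C(x) can be sketched as below, assuming F is monotone and L-Lipschitz with step size λ < 1/L. This is the textbook form of the method, not the paper's HPE-based variant.

```python
import numpy as np

# Tseng's forward-backward-forward splitting for  0 in F(x) + N_C(x):
# a forward-backward step followed by a corrective second forward step.
# proj_C is the Euclidean projection onto the closed convex set C.
def tseng_fbf(F, proj_C, x0, lam=0.5, iters=200):
    x = x0.copy()
    for _ in range(iters):
        y = proj_C(x - lam * F(x))              # forward-backward step
        x = proj_C(y - lam * (F(y) - F(x)))     # correction step
    return x
```

Unlike plain forward-backward splitting, the extra forward evaluation lets the method handle merely monotone (e.g. skew) operators, the same class Korpelevich's extragradient method targets.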

Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: a Generic Algorithmic Framework

In this paper we present a generic algorithmic framework, namely, the accelerated stochastic approximation (AC-SA) algorithm, for solving strongly convex stochastic composite optimization (SCO) problems. While the classical stochastic approximation (SA) algorithms are asymptotically optimal for solving differentiable and strongly convex problems, the AC-SA algorithm, when employed with proper stepsize policies, can achieve optimal or …
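As a point of reference, the classical SA baseline the abstract mentions is a (projected/proximal) stochastic subgradient iteration with stepsizes of order 1/(μk). The sketch below shows that baseline, not the accelerated AC-SA scheme itself; the oracle and prox interfaces are assumptions for illustration.

```python
import numpy as np

# Classical stochastic approximation for a mu-strongly convex composite
# objective: at step k take a noisy (sub)gradient and a prox step with
# the standard O(1/k) stepsize gamma_k = 2 / (mu * (k + 1)).
def sgd_strongly_convex(grad_oracle, prox, x0, mu, iters=5000, seed=0):
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for k in range(1, iters + 1):
        gamma = 2.0 / (mu * (k + 1))
        g = grad_oracle(x, rng)            # stochastic (sub)gradient sample
        x = prox(x - gamma * g, gamma)     # prox step on the composite term
    return x
```

This scheme attains the O(1/k) rate in expected squared distance for strongly convex problems; the AC-SA framework improves the dependence on the smooth component's Lipschitz constant via acceleration.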

A splitting method for separate convex programming with linking linear constraints

We consider the separate convex programming problem with linking linear constraints, where the objective function is in the form of the sum of m individual functions without crossed variables. The special case with m=2 has been well studied in the literature and some algorithms are very influential, e.g. the alternating direction method. The research for …
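The influential m=2 special case mentioned above, min f(x) + g(z) subject to Ax + Bz = c, is solved by the alternating direction method roughly as follows. The argmin oracles and penalty parameter below are generic assumptions, not the paper's proposed splitting for general m.

```python
import numpy as np

# Classical alternating direction method (m = 2):
#   min f(x) + g(z)  s.t.  Ax + Bz = c.
# argmin_x / argmin_z minimize the augmented Lagrangian
#   f(x) + g(z) - lam^T (Ax + Bz - c) + (beta/2)||Ax + Bz - c||^2
# over x and z respectively; beta > 0 is the penalty parameter.
def adm(argmin_x, argmin_z, A, B, c, z0, lam0, beta=1.0, iters=100):
    z, lam = z0.copy(), lam0.copy()
    for _ in range(iters):
        x = argmin_x(z, lam, beta)                # x-minimization
        z = argmin_z(x, lam, beta)                # z-minimization
        lam = lam - beta * (A @ x + B @ z - c)    # multiplier update
    return x, z, lam
```

The appeal of the scheme is exactly the separability the abstract highlights: each subproblem involves only one block of variables, so the two blocks never need to be minimized jointly.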

Alternating proximal algorithms for constrained variational inequalities. Application to domain decomposition for PDE’s

Let $\mathcal{X},\mathcal{Y},\mathcal{Z}$ be real Hilbert spaces, let $f : \mathcal{X} \rightarrow \mathbb{R}\cup\{+\infty\}$, $g : \mathcal{Y} \rightarrow \mathbb{R}\cup\{+\infty\}$ be closed convex functions and let $A : \mathcal{X} \rightarrow \mathcal{Z}$, $B : \mathcal{Y} \rightarrow \mathcal{Z}$ be linear continuous operators. Let us consider the constrained minimization problem $$ \min\{f(x)+g(y):\quad Ax=By\}.\leqno (\mathcal{P})$$ Given a sequence $(\gamma_n)$ which tends toward …

Proximal alternating direction-based contraction methods for separable linearly constrained convex optimization

The alternating direction method (ADM) has been well studied in the context of linearly constrained convex programming problems. Recently, owing to its efficiency and ease of implementation in novel applications, ADM has been extended to the case with a finite number of separable parts. The algorithmic framework of the extended method consists of two …

L1 Minimization via Randomized First Order Algorithms

In this paper we propose randomized first-order algorithms for solving bilinear saddle point problems. Our developments are motivated by the need for sublinear time algorithms to solve large-scale parametric bilinear saddle point problems where cheap online assessment of solution quality is crucial. We present the theoretical efficiency estimates of our algorithms and discuss a number …

Information Geometry and Primal-Dual Interior-point Algorithms

In this paper, we study polynomial-time interior-point algorithms from the viewpoint of information geometry. We introduce an information geometric structure for a conic linear program based on a self-concordant barrier function. A Riemannian metric is defined via the Hessian of the barrier function. We introduce two connections $\nabla$ and $\nabla^*$ which roughly correspond to the primal and …

Generic nondegeneracy in convex optimization

We show that minimizers of convex functions subject to almost all linear perturbations are nondegenerate. An analogous result holds more generally, for lower-C^2 functions. Citation: Cornell University, School of Operations Research and Information Engineering, 206 Rhodes Hall, Ithaca, NY 14853. May 2010.

On Equivalence of Semidefinite Relaxations for Quadratic Matrix Programming

In this paper, we analyze two popular semidefinite programming (SDP) relaxations for quadratically constrained quadratic programs (QCQP) with matrix variables. These are based on \emph{vector-lifting} and on \emph{matrix lifting} and are of different size and expense. We prove, under mild assumptions, that these two relaxations provide equivalent bounds. Thus, our results provide a theoretical guideline …