A general inertial proximal point algorithm for mixed variational inequality problem

In this paper, we first propose a general inertial \emph{proximal point algorithm} (PPA) for the mixed \emph{variational inequality} (VI) problem. To the best of our knowledge, no convergence rate result is known in the literature for inertial-type PPAs without stronger assumptions. Under certain conditions, we are able to establish the global convergence and a nonasymptotic $O(1/k)$ convergence …
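As a hedged illustration of the inertial idea (not the paper's algorithm for mixed VIs), the generic inertial proximal point iteration can be sketched on the toy problem $\min_x \frac12\|x-b\|^2$, whose proximal map is available in closed form; the parameter choices below are illustrative assumptions.

```python
import numpy as np

# Sketch of a generic inertial proximal point iteration on the toy problem
# min_x 0.5*||x - b||^2, whose proximal map is explicit:
# prox_{lam*f}(y) = (y + lam*b) / (1 + lam).
def inertial_ppa(b, lam=1.0, alpha=0.3, iters=100):
    x_prev = np.zeros_like(b)
    x = np.zeros_like(b)
    for _ in range(iters):
        y = x + alpha * (x - x_prev)                 # inertial extrapolation
        x_prev, x = x, (y + lam * b) / (1.0 + lam)   # proximal step
    return x
```

With `alpha = 0` this reduces to the classical PPA; the inertial term reuses the previous iterate to extrapolate before the proximal step.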

Inertial Proximal ADMM for Linearly Constrained Separable Convex Optimization

The \emph{alternating direction method of multipliers} (ADMM) is a popular and efficient first-order method that has recently found numerous applications, and the proximal ADMM is an important variant of it. The main contributions of this paper are to propose and analyze a class of inertial proximal ADMMs, which unify the basic ideas of …

On the non-ergodic convergence rate of an inexact augmented Lagrangian framework for composite convex programming

In this paper, we consider the linearly constrained composite convex optimization problem, whose objective is a sum of a smooth function and a possibly nonsmooth function. We propose an inexact augmented Lagrangian (IAL) framework for solving the problem. The stopping criterion used in solving the augmented Lagrangian (AL) subproblem in the proposed IAL framework is …
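For context, a hedged sketch of an inexact augmented Lagrangian loop on a toy equality-constrained problem: the tolerance schedule `eps_k` and all parameter choices below are our own illustrative assumptions, not the stopping criterion analyzed in the paper.

```python
import numpy as np

# Inexact augmented Lagrangian (IAL) sketch for the toy problem
#     min 0.5*||x - c||^2   s.t.  a^T x = b.
# Each AL subproblem is minimized only inexactly, by gradient descent
# run until the gradient norm drops below a tolerance eps_k.
def inexact_al(a, b, c, beta=1.0, outer=50):
    x = np.zeros_like(c)
    lam = 0.0
    step = 1.0 / (1.0 + beta * (a @ a))  # 1/L for the quadratic AL subproblem
    for k in range(1, outer + 1):
        eps_k = 1.0 / k**2  # subproblem tolerance, tightened over iterations

        def grad(x):
            # gradient of 0.5*||x-c||^2 + lam*(a@x - b) + 0.5*beta*(a@x - b)^2
            return (x - c) + lam * a + beta * (a @ x - b) * a

        while np.linalg.norm(grad(x)) > eps_k:  # inexact stopping criterion
            x = x - step * grad(x)
        lam = lam + beta * (a @ x - b)  # multiplier update
    return x, lam
```

The summable tolerance sequence is one common way to keep the accumulated subproblem error under control; the paper's framework studies this kind of criterion in a general composite setting.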

Global Convergence of Unmodified 3-Block ADMM for a Class of Convex Minimization Problems

The alternating direction method of multipliers (ADMM) has been successfully applied to solve structured convex optimization problems due to its superior practical performance. The convergence properties of the 2-block ADMM have been studied extensively in the literature. Specifically, it has been proven that the 2-block ADMM globally converges for any penalty parameter $\gamma>0$. In this …
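For reference, a hedged sketch of the standard 2-block ADMM on an illustrative lasso-style splitting (our own toy instance, not a problem from the paper): $\min_x \frac12\|Ax-b\|^2 + \mu\|z\|_1$ subject to $x - z = 0$.

```python
import numpy as np

# 2-block ADMM sketch for  min 0.5*||Ax - b||^2 + mu*||z||_1  s.t.  x - z = 0,
# with penalty parameter gamma > 0 (scaled-dual form).
def admm_lasso(A, b, mu=0.1, gamma=1.0, iters=200):
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)  # u: scaled multiplier
    L = A.T @ A + gamma * np.eye(n)  # normal-equation matrix for the x-block
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(L, Atb + gamma * (z - u))           # x-block
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - mu / gamma, 0)  # z-block (soft-threshold)
        u = u + x - z                                           # multiplier update
    return x, z
```

Here `gamma` plays the role of the penalty parameter $\gamma$ mentioned above; in the 2-block case global convergence holds for any $\gamma>0$, which is the result this paper extends to a 3-block setting.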

Iteration Complexity Analysis of Multi-Block ADMM for a Family of Convex Minimization without Strong Convexity

The alternating direction method of multipliers (ADMM) is widely used in solving structured convex optimization problems due to its superior practical performance. On the theoretical side, however, a counterexample was shown in [7] indicating that the multi-block ADMM for minimizing the sum of $N$ $(N\geq 3)$ convex functions with $N$ block variables linked by linear …

New Ranks for Even-Order Tensors and Their Applications in Low-Rank Tensor Optimization

In this paper, we propose three new tensor decompositions for even-order tensors, corresponding respectively to the rank-one decompositions of some unfolded matrices. Consequently, these new decompositions lead to three new notions of (even-order) tensor rank, to be called the M-rank, the symmetric M-rank, and the strongly symmetric M-rank in this paper. We discuss the bounds …
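To illustrate the underlying idea of matricization-based ranks (a hypothetical example; the precise M-rank definitions are those of the paper), one can form a square unfolding of a 4th-order tensor and take the rank of the resulting matrix:

```python
import numpy as np

# One possible square unfolding of a 4th-order tensor: group modes (1,2)
# as rows and (3,4) as columns, then take the matrix rank. This is only
# meant to convey the matricization idea behind rank notions like the
# M-rank; the exact definitions are in the paper.
def square_unfolding_rank(T):
    n1, n2, n3, n4 = T.shape
    return np.linalg.matrix_rank(T.reshape(n1 * n2, n3 * n4))

# A rank-one 4th-order tensor a (x) b (x) c (x) d unfolds to a rank-one matrix:
rng = np.random.default_rng(0)
a, b, c, d = rng.random(3), rng.random(4), rng.random(3), rng.random(4)
T = np.einsum('i,j,k,l->ijkl', a, b, c, d)
```

The unfolding of the outer product equals $(a\otimes b)(c\otimes d)^{\top}$, so its matrix rank is one.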

Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization

In this paper, we study stochastic quasi-Newton methods for nonconvex stochastic optimization, where we assume that only stochastic information about the gradients of the objective function is available via a stochastic first-order oracle (SFO). First, we propose a general framework of stochastic quasi-Newton methods for solving nonconvex stochastic optimization. The proposed framework extends the classic …
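As a heavily hedged caricature of the SFO setting (not the paper's framework): noisy gradients of a toy quadratic are drawn from a simulated oracle, and a scalar secant-based Hessian estimate scales a diminishing step. The clipping range for the estimate is a safeguard we assume for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stochastic first-order oracle (SFO) for f(x) = 0.5*||x - target||^2:
# returns the exact gradient plus small Gaussian noise.
def sfo_grad(x, target):
    return (x - target) + rng.normal(scale=0.01, size=x.shape)

# Toy stochastic quasi-Newton sketch: a scalar Hessian estimate h is updated
# from one secant pair (s, y) per iteration, Barzilai-Borwein style, and the
# step is scaled by 1/h with a diminishing step-size schedule.
def stochastic_qn(target, iters=500):
    x = np.zeros_like(target)
    g = sfo_grad(x, target)
    h = 1.0  # scalar Hessian estimate (the true Hessian here is the identity)
    for k in range(1, iters + 1):
        x_new = x - (1.0 / h) * (1.0 / k**0.6) * g  # scaled, diminishing step
        g_new = sfo_grad(x_new, target)
        s, y = x_new - x, g_new - g
        if s @ y > 0:  # curvature condition, as in deterministic quasi-Newton
            h = np.clip((s @ y) / (s @ s), 0.5, 2.0)  # secant update, kept safe
        x, g = x_new, g_new
    return x
```

A genuine stochastic quasi-Newton method builds a full (or limited-memory) matrix approximation and handles noise in the secant pairs with more care; this scalar version is only meant to show the oracle-plus-curvature structure.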

Inertial primal-dual algorithms for structured convex optimization

The primal-dual algorithm recently proposed by Chambolle \& Pock (abbreviated as CPA) for structured convex optimization is very efficient and popular. It was shown by Chambolle \& Pock in \cite{CP11} and also by Shefi \& Teboulle in \cite{ST14} that CPA and variants are closely related to preconditioned versions of the popular alternating direction method of …
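For reference, a hedged sketch of the Chambolle–Pock iteration on the illustrative problem $\min_x \frac12\|x-b\|^2 + \mu\|Kx\|_1$ (our own toy instance); the step sizes assume $\tau\sigma\|K\|^2 \leq 1$.

```python
import numpy as np

# Chambolle-Pock (CPA) sketch for  min_x 0.5*||x - b||^2 + mu*||K x||_1.
# Both proximal maps are available in closed form for this instance.
def chambolle_pock(K, b, mu=0.5, tau=0.4, sigma=0.4, theta=1.0, iters=300):
    m, n = K.shape
    x = np.zeros(n)
    x_bar = np.zeros(n)
    y = np.zeros(m)
    for _ in range(iters):
        # dual step: prox of sigma*(mu*||.||_1)^* is projection onto [-mu, mu]
        y = np.clip(y + sigma * (K @ x_bar), -mu, mu)
        x_old = x
        # primal step: prox of tau*(0.5*||. - b||^2) in closed form
        x = (x - tau * (K.T @ y) + tau * b) / (1.0 + tau)
        x_bar = x + theta * (x - x_old)  # extrapolation step
    return x
```

The extrapolation with `theta = 1` is the standard choice; the inertial variants studied in this paper modify the iteration by additionally extrapolating from past iterates.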

On the Sublinear Convergence Rate of Multi-Block ADMM

The alternating direction method of multipliers (ADMM) is widely used in solving structured convex optimization problems. Despite its success in practice, the convergence of the standard ADMM for minimizing the sum of $N$ $(N\geq 3)$ convex functions whose variables are linked by linear constraints has remained unclear for a very long time. Recently, Chen …

On the Global Linear Convergence of the ADMM with Multi-Block Variables

The alternating direction method of multipliers (ADMM) has been widely used for solving structured convex optimization problems. In particular, the ADMM can solve convex programs that minimize the sum of $N$ convex functions with $N$-block variables linked by some linear constraints. While the convergence of the ADMM for $N=2$ was well established in the literature, …