A contraction method with implementable proximal regularization for linearly constrained convex programming

The proximal point algorithm (PPA) is classical, and it is implicit in the sense that the resulting proximal subproblems may be as difficult as the original problem. In this paper, we show that with appropriate choices of proximal parameters, the application of PPA to linearly constrained convex programming can result in easy proximal subproblems. …
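
For orientation, here is a minimal sketch of the setting in assumed notation ($f$, $A$, $b$, $r$, $u$ are not taken from the abstract). The model is the linearly constrained convex program
\[
\min_{x}\; f(x) \quad \text{s.t.} \quad Ax = b,
\]
and the classical (exact) PPA applied to a convex problem $\min_{u}\theta(u)$ generates
\[
u^{k+1} \;=\; \arg\min_{u}\;\Big\{\theta(u) + \tfrac{r}{2}\,\|u - u^{k}\|^{2}\Big\},
\]
which is implicit because this subproblem can be as hard as the original problem. The abstract's point is that, for the constrained model above, a suitable (non-scalar) choice of the proximal regularization makes the subproblems easy; the specific choice is not reproduced here.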

Convergence analysis of primal-dual algorithms for total variation image restoration

Recently, some attractive primal-dual algorithms have been proposed for solving a saddle-point problem, with particular applications in the area of total variation (TV) image restoration. This paper focuses on the convergence analysis of existing primal-dual algorithms and shows that the parameters involved in those algorithms (including the step sizes) can be significantly enlarged if …
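
As a point of reference, one well-known primal-dual scheme of this type (the notation $K$, $f$, $g^{*}$, $\tau$, $\sigma$, $\theta$ is assumed here, not quoted from the abstract) treats the saddle-point problem
\[
\min_{x}\max_{y}\;\langle Kx,\,y\rangle + f(x) - g^{*}(y)
\]
by iterating
\[
y^{k+1} = \operatorname{prox}_{\sigma g^{*}}\!\big(y^{k} + \sigma K\bar{x}^{k}\big),\qquad
x^{k+1} = \operatorname{prox}_{\tau f}\!\big(x^{k} - \tau K^{\top}y^{k+1}\big),\qquad
\bar{x}^{k+1} = x^{k+1} + \theta\,(x^{k+1}-x^{k}),
\]
where a condition such as $\tau\sigma\|K\|^{2}<1$ is commonly imposed on the step sizes; the abstract's claim concerns how far such parameter restrictions can be relaxed.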

On the acceleration of augmented Lagrangian method for linearly constrained optimization

The classical augmented Lagrangian method (ALM) plays a fundamental role in the algorithmic development of constrained optimization. In this paper, we mainly show that Nesterov’s influential acceleration techniques can be applied to accelerate ALM, thus yielding an accelerated ALM whose iteration-complexity is O(1/k^2) for linearly constrained convex programming. As a by-product, we also easily show that …
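
For context, the classical ALM to which the acceleration is applied can be sketched as follows (the symbols $f$, $A$, $b$, $\beta$, $\lambda$ are assumed notation). For $\min_{x} f(x)$ subject to $Ax=b$, with augmented Lagrangian $\mathcal{L}_{\beta}(x,\lambda)=f(x)-\lambda^{\top}(Ax-b)+\tfrac{\beta}{2}\|Ax-b\|^{2}$, ALM iterates
\[
x^{k+1} = \arg\min_{x}\; \mathcal{L}_{\beta}(x,\lambda^{k}),
\qquad
\lambda^{k+1} = \lambda^{k} - \beta\,(Ax^{k+1}-b).
\]
The Nesterov-type acceleration referred to in the abstract modifies this scheme so that the iteration-complexity becomes O(1/k^2); the accelerated iteration itself is not reproduced here.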

An accelerated inexact proximal point algorithm for convex minimization

The proximal point algorithm (PPA) is classical and popular in the optimization community. In practice, inexact PPAs, which solve the involved proximal subproblems approximately subject to certain inexactness criteria, are truly implementable. In this paper, we first propose an inexact PPA with a new inexactness criterion for solving convex minimization, and show that the …
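
As a reminder of the mechanism (in assumed notation; the paper's new criterion is not reproduced here), an inexact PPA for $\min_{u}\theta(u)$ accepts an approximate solution of each proximal subproblem, for instance in the classical summable-error form
\[
u^{k+1} \approx \arg\min_{u}\Big\{\theta(u)+\tfrac{r_{k}}{2}\|u-u^{k}\|^{2}\Big\},
\qquad
\big\|u^{k+1}-\tilde{u}^{k+1}\big\|\le \varepsilon_{k},\quad \sum_{k}\varepsilon_{k}<\infty,
\]
where $\tilde{u}^{k+1}$ denotes the exact minimizer. Criteria of this type go back to Rockafellar; the abstract proposes a different, new criterion and builds an accelerated scheme on top of it.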

A splitting method for separate convex programming with linking linear constraints

We consider the separate convex programming problem with linking linear constraints, where the objective function is the sum of m individual functions without crossed variables. The special case with m=2 has been well studied in the literature, and some algorithms are very influential, e.g., the alternating direction method. The research for …
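
In assumed notation, the model described in the abstract can be written as
\[
\min\; \sum_{i=1}^{m}\theta_{i}(x_{i})
\quad\text{s.t.}\quad
\sum_{i=1}^{m}A_{i}x_{i}=b,\qquad x_{i}\in\mathcal{X}_{i},\;\; i=1,\dots,m,
\]
where each $\theta_{i}$ depends only on its own block $x_{i}$ and the blocks are linked only through the linear constraint; the alternating direction method is the influential solver for the case m=2.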

The Globally Uniquely Solvable Property of Second-Order Cone Linear Complementarity Problems

The globally uniquely solvable (GUS) property of the linear transformation of linear complementarity problems over symmetric cones has recently been studied by Gowda et al. via the approach of Euclidean Jordan algebras. In this paper, we contribute a new approach to characterizing the GUS property of the linear transformation of the second-order cone linear …
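
For concreteness (notation assumed, not quoted from the abstract), the second-order cone linear complementarity problem asks, for a linear transformation $M$ and a vector $q$, to find $x$ with
\[
x\in\mathcal{K},\qquad Mx+q\in\mathcal{K},\qquad x^{\top}(Mx+q)=0,
\]
where $\mathcal{K}$ is the second-order cone $\{(x_{1},x_{2})\in\mathbb{R}\times\mathbb{R}^{n-1}:x_{1}\ge\|x_{2}\|\}$ or a Cartesian product of such cones. The GUS property of $M$ means that this problem has a unique solution for every $q$.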

The unified framework of some proximal-based decomposition methods for monotone variational inequalities with separable structure

Some existing decomposition methods for solving a class of variational inequalities (VI) with separable structures are closely related to the classical proximal point algorithm, as their decomposed sub-VIs are regularized by proximal terms. Depending on whether the generated sub-VIs are suitable for parallel computation, these proximal-based methods can be categorized into the parallel decomposition methods …
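
The class of problems in question can be sketched as follows (notation assumed): find $u^{*}\in\Omega$ such that
\[
(u-u^{*})^{\top}F(u^{*})\ge 0\quad\text{for all }u\in\Omega,
\]
where the structure is separable in the sense that $u=(x,y)$, $F(u)=\big(f(x),\,g(y)\big)$, and $\Omega=\{(x,y):Ax+By=b,\;x\in\mathcal{X},\;y\in\mathcal{Y}\}$. Decomposition methods of the kind surveyed here split each iteration into sub-VIs in $x$ and $y$, each regularized by a proximal term; whether these sub-VIs can be solved in parallel or must be solved sequentially is the distinction drawn in the abstract.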

Linearized augmented Lagrangian and alternating direction methods for nuclear norm minimization

The nuclear norm is widely used to induce low-rank solutions for many optimization problems with matrix variables. Recently, it has been shown that the augmented Lagrangian method (ALM) and the alternating direction method (ADM) are very efficient for many convex programming problems arising from various applications, provided that the resulting subproblems are sufficiently simple to …
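
A minimal sketch of why such subproblems can be "sufficiently simple": the proximal step of the nuclear norm has a closed form via singular value thresholding, as illustrated below (the function name and the use of NumPy are my own; this is not code from the paper).

```python
import numpy as np

def singular_value_thresholding(Y, tau):
    """Closed-form solution of  argmin_X  0.5*||X - Y||_F^2 + tau*||X||_*.

    Soft-thresholds the singular values of Y; this is the kind of simple
    subproblem that linearized ALM/ADM schemes aim to produce.
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt
```

For instance, `singular_value_thresholding(M, 0.1)` returns the matrix closest to `M` in the Frobenius sense after shrinking its singular values by 0.1 and setting the small ones to zero.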

Recovering low-rank and sparse components of matrices from incomplete and noisy observations

Many applications arising in a variety of fields can be cast as the task of recovering the low-rank and sparse components of a given matrix. Recently, it has been discovered that this NP-hard task can be accomplished, both theoretically and numerically, by heuristically solving a convex relaxation problem in which the widely acknowledged nuclear norm and …
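
In assumed notation, a convex relaxation of the kind referred to in the abstract reads
\[
\min_{L,\,S}\; \|L\|_{*} + \tau\|S\|_{1}
\quad\text{s.t.}\quad
\big\|P_{\Omega}(L+S-M)\big\|_{F}\le\delta,
\]
where $M$ is the observed data, $P_{\Omega}$ keeps only the observed (incomplete) entries, $\delta$ accounts for the noise, the nuclear norm $\|\cdot\|_{*}$ promotes a low-rank $L$, and the $\ell_{1}$ norm promotes a sparse $S$. The exact model treated in the paper may differ in details; this is only meant to fix ideas.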

Alternating directions based contraction method for generally separable linearly constrained convex programming problems

The classical alternating direction method (ADM) has been well studied in the context of linearly constrained convex programming problems and variational inequalities where both the involved operators and constraints are separable into two parts. In particular, recent years have witnessed a number of novel applications arising in diversified areas (e.g., image processing and statistics), for which …
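
For reference, the classical two-block ADM mentioned in the abstract can be stated as follows (notation assumed): for $\min\;\theta_{1}(x)+\theta_{2}(y)$ subject to $Ax+By=b$, with penalty parameter $\beta>0$, it iterates
\[
\begin{aligned}
x^{k+1} &= \arg\min_{x}\;\theta_{1}(x)-(\lambda^{k})^{\top}(Ax+By^{k}-b)+\tfrac{\beta}{2}\|Ax+By^{k}-b\|^{2},\\
y^{k+1} &= \arg\min_{y}\;\theta_{2}(y)-(\lambda^{k})^{\top}(Ax^{k+1}+By-b)+\tfrac{\beta}{2}\|Ax^{k+1}+By-b\|^{2},\\
\lambda^{k+1} &= \lambda^{k}-\beta\,(Ax^{k+1}+By^{k+1}-b).
\end{aligned}
\]
The contraction method of the title extends this alternating idea to problems whose operators and constraints separate into more than two parts; the extension itself is not spelled out in the truncated abstract.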