A family of accelerated inexact augmented Lagrangian methods with applications to image restoration

In this paper, we focus on a class of convex optimization problems subject to equality or inequality constraints and develop an Accelerated Inexact Augmented Lagrangian Method (AI-ALM). Different relative error criteria are designed so that the subproblem of AI-ALM can be solved inexactly, and the popularly used relaxation step is exploited to accelerate convergence. By a …
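
For orientation, a generic inexact augmented Lagrangian iteration with a relaxed multiplier update for min_x f(x) subject to Ax = b has the following shape; the notation is a sketch only, and the specific error criteria, relaxation step, and parameter ranges of AI-ALM are those developed in the paper.

\[
\begin{aligned}
x^{k+1} &\approx \arg\min_x\; f(x) + \langle \lambda^k, Ax - b\rangle + \tfrac{\beta}{2}\|Ax - b\|^2 \quad \text{(solved inexactly under a relative error criterion)},\\
\lambda^{k+1} &= \lambda^k + \gamma\,\beta\,(Ax^{k+1} - b), \qquad \gamma \in (0,2)\ \text{(relaxation factor)}.
\end{aligned}
\]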

An inexact ADMM with proximal-indefinite term and larger stepsize

This work is devoted to developing an inexact ADMM for solving a family of multi-block separable convex optimization problems subject to linear equality constraints, where the problem variables are artificially partitioned into two groups. The subproblems in the first group are solved inexactly and in parallel under relative error criteria, while the single subproblem in the second group (often …
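
As a rough template (all symbols here are assumed for illustration, not taken from the paper), a two-group splitting for min Σ_i f_i(x_i) + g(y) subject to Σ_i A_i x_i + By = b updates the first group in parallel and then the second group:

\[
\begin{aligned}
x_i^{k+1} &\approx \arg\min_{x_i}\; \mathcal{L}_\beta\big(x_1^k,\dots,x_i,\dots,x_p^k,\, y^k,\, \lambda^k\big) \quad \text{in parallel, } i = 1,\dots,p\ \text{(inexact, relative error criterion)},\\
y^{k+1} &= \arg\min_{y}\; \mathcal{L}_\beta\big(x^{k+1},\, y,\, \lambda^k\big),\\
\lambda^{k+1} &= \lambda^k + s\,\beta\Big(\textstyle\sum_i A_i x_i^{k+1} + By^{k+1} - b\Big),
\end{aligned}
\]

where the proximal-indefinite term attached to the first-group subproblems and the admissible (larger) stepsize s are exactly what the paper specifies.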

A New Insight on Augmented Lagrangian Method with Applications in Machine Learning

Motivated by the work [He-Yuan, Balanced augmented Lagrangian method for convex programming, arXiv: 2108.08554v1, (2021)], a novel augmented Lagrangian method with a relaxation step is proposed for solving a family of convex optimization problems subject to equality or inequality constraints. This new method is then extended to solve the multi-block separable convex optimization problem, and …
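
One common way to couple an augmented-Lagrangian-type step with a relaxation step (shown here only as a generic prediction-correction pattern, not as the paper's exact update) is to treat the ALM output as a predictor \(\tilde{w}^k = (\tilde{x}^k, \tilde{\lambda}^k)\) of the primal-dual pair \(w = (x, \lambda)\) and then relax:

\[
w^{k+1} = w^k - \alpha\,\big(w^k - \tilde{w}^k\big), \qquad \alpha \in (0,2),
\]

with the admissible range of \(\alpha\) and the precise form of the predictor determined by the analysis in the paper.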

Accelerated Stochastic Peaceman-Rachford Method for Empirical Risk Minimization

This work is devoted to studying an Accelerated Stochastic Peaceman-Rachford Splitting Method (AS-PRSM) for solving a family of structured empirical risk minimization problems. The objective function of this problem is the sum of a possibly nonsmooth convex function and a finite sum of smooth convex component functions. The smooth subproblem in AS-PRSM is solved by an …
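
For context, classical Peaceman-Rachford splitting for min f(x) + h(y) subject to Ax + By = b updates the multiplier twice per iteration; AS-PRSM replaces the exact solve of the smooth subproblem by a stochastic, accelerated inner solver, roughly along these lines (notation assumed; treating the y-block as the smooth finite sum is purely for illustration):

\[
\begin{aligned}
x^{k+1} &= \arg\min_x\; \mathcal{L}_\beta(x, y^k, \lambda^k),\\
\lambda^{k+\frac12} &= \lambda^k + \beta\,(Ax^{k+1} + By^k - b),\\
y^{k+1} &\approx \arg\min_y\; \mathcal{L}_\beta(x^{k+1}, y, \lambda^{k+\frac12}) \quad \text{(stochastic, accelerated inner solver)},\\
\lambda^{k+1} &= \lambda^{k+\frac12} + \beta\,(Ax^{k+1} + By^{k+1} - b).
\end{aligned}
\]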

Iteration complexity analysis of a partial LQP-based alternating direction method of multipliers

In this paper, we consider a prototypical convex optimization problem with multi-block variables and separable structure. By adding a Logarithmic Quadratic Proximal (LQP) regularizer with a suitable proximal parameter to each subproblem in the first group, we develop a partial LQP-based Alternating Direction Method of Multipliers (ADMM-LQP). The dual variable is updated twice with relatively larger …
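
Schematically (with the LQP regularizer written abstractly as \(d_{\mathrm{LQP}}(\cdot,\cdot)\) and all other symbols assumed for illustration), the pattern combines regularized first-group updates with a double dual update; whether the first-group subproblems are handled in parallel or sequentially, and the exact placement and sizes of the two dual steps, follow the paper.

\[
\begin{aligned}
x_i^{k+1} &= \arg\min_{x_i}\; \mathcal{L}_\beta\big(\dots, x_i, \dots,\, y^k,\, \lambda^k\big) + \mu\, d_{\mathrm{LQP}}\big(x_i, x_i^k\big), \qquad i = 1,\dots,p,\\
\lambda^{k+\frac12} &= \lambda^k + s\,\beta\,\big(\text{primal residual at } (x^{k+1}, y^k)\big),\\
y^{k+1} &= \arg\min_{y}\; \mathcal{L}_\beta\big(x^{k+1}, y, \lambda^{k+\frac12}\big),\\
\lambda^{k+1} &= \lambda^{k+\frac12} + r\,\beta\,\big(\text{primal residual at } (x^{k+1}, y^{k+1})\big).
\end{aligned}
\]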

A family of multi-parameterized proximal point algorithms

In this paper, a multi-parameterized proximal point algorithm combined with a relaxation step is developed for solving convex minimization problems subject to linear constraints. We show its global convergence and sublinear convergence rate from the perspective of variational inequalities. Preliminary numerical experiments on a sparse minimization problem from signal processing indicate that the proposed …
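
In the variational inequality view (a generic sketch, with the multi-parameterized proximal matrix M standing in for the paper's construction), the primal-dual pair \(w = (x, \lambda)\) solves a mixed variational inequality with operator F, and each iteration is a proximal step followed by a relaxation:

\[
\begin{aligned}
&\text{find } \tilde{w}^k \ \text{such that}\ \big(w - \tilde{w}^k\big)^{\top}\big(F(\tilde{w}^k) + M(\tilde{w}^k - w^k)\big) \ge 0 \quad \forall\, w,\\
&w^{k+1} = w^k - \gamma\,\big(w^k - \tilde{w}^k\big), \qquad \gamma \in (0,2).
\end{aligned}
\]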

Accelerated Symmetric ADMM and Its Applications in Signal Processing

The alternating direction method of multipliers (ADMM) has been extensively investigated over the past decades for solving separable convex optimization problems. Fewer researchers have explored its convergence properties in the nonconvex case, although it performs surprisingly efficiently there. In this paper, we propose a symmetric ADMM based on different acceleration techniques for a family of potentially …
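
Since the applications here are in signal processing, a minimal Python sketch of a symmetric ADMM with two dual stepsizes s and r on a standard LASSO splitting, min 0.5||Dx - c||^2 + mu*||y||_1 subject to x - y = 0, may help fix ideas; the model, parameter values, and function names below are assumptions for illustration, not the paper's algorithm or experiments.

import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def symmetric_admm_lasso(D, c, mu=0.1, beta=1.0, s=0.9, r=0.9, iters=300):
    """Generic symmetric ADMM sketch for  min 0.5||Dx-c||^2 + mu||y||_1  s.t. x - y = 0.

    The dual variable is updated twice per iteration with stepsizes s and r;
    all parameter choices here are illustrative only.
    """
    _, n = D.shape
    x = np.zeros(n); y = np.zeros(n); lam = np.zeros(n)
    DtD = D.T @ D
    Dtc = D.T @ c
    # The x-subproblem reduces to the linear system (D^T D + beta*I) x = D^T c - lam + beta*y.
    A = DtD + beta * np.eye(n)
    for _ in range(iters):
        x = np.linalg.solve(A, Dtc - lam + beta * y)
        lam = lam + s * beta * (x - y)                  # first (intermediate) dual update
        y = soft_threshold(x + lam / beta, mu / beta)   # y-subproblem in closed form
        lam = lam + r * beta * (x - y)                  # second dual update
    return x, y

For instance, x, y = symmetric_admm_lasso(np.random.randn(50, 200), np.random.randn(50)) runs the sketch on random data; the admissible region for (s, r) established in the paper is what guarantees convergence.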

Optimal linearized symmetric ADMM for separable convex programming

Due to its wide applications and simple implementation, the Alternating Direction Method of Multipliers (ADMM) has been extensively investigated by researchers from different areas. In this paper, we focus on a linearized symmetric ADMM (LSADMM) for solving the multi-block separable convex minimization model. This LSADMM partitions the variables into two groups and updates …
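
The linearization referred to here typically means replacing the augmented quadratic term of a subproblem by its linearization at the current point plus a proximal term (generic notation below; the admissible, and in particular optimal, range of the proximal parameter \(\tau_i\) is the subject of the paper):

\[
x_i^{k+1} = \arg\min_{x_i}\;\theta_i(x_i) + \beta\,\big\langle A_i^{\top} r^k,\; x_i \big\rangle + \frac{\tau_i}{2}\,\big\|x_i - x_i^k\big\|^2,
\]

where \(r^k\) denotes the current (dual-shifted) constraint residual; equivalently, the exact subproblem is regularized by the possibly indefinite proximal term \(\tfrac12\|x_i - x_i^k\|^2_{\tau_i I - \beta A_i^{\top}A_i}\).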

A projection algorithm based on KKT conditions for convex quadratic semidefinite programming with nonnegative constraints

The dual form of the convex quadratic semidefinite programming (CQSDP) problem with nonnegative constraints is a 4-block separable convex optimization problem. It is known that the directly extended 4-block alternating direction method of multipliers (ADMM4d) is very efficient for solving the dual, but its convergence is not guaranteed. In this paper, we reformulate the dual as a …
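
For reference, "directly extended" means applying the two-block ADMM recursion blockwise: with the dual blocks written abstractly as \(w_1,\dots,w_4\) and multiplier \(Z\) (names assumed here, not the paper's), ADMM4d sweeps

\[
w_i^{k+1} = \arg\min_{w_i}\; \mathcal{L}_\sigma\big(w_1^{k+1},\dots,w_{i-1}^{k+1}, w_i, w_{i+1}^{k},\dots,w_4^{k};\, Z^k\big), \quad i = 1,\dots,4, \qquad
Z^{k+1} = Z^k + \sigma\,\big(\text{residual of the linear constraint at } w^{k+1}\big),
\]

which is efficient in practice but, as noted above, lacks a convergence guarantee in general.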

A One-Parameter Family of Middle Proximal ADMM for Constrained Separable Convex Optimization

This work is devoted to studying a family of Middle Proximal Alternating Direction Methods of Multipliers (MP-ADM) for solving multi-block constrained separable convex optimization problems. This one-parameter family of MP-ADM combines both Jacobian- and Gauss-Seidel-type alternating direction updates, and proximal point techniques are applied only to the middle subproblems to promote convergence. We …
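
Purely to indicate the flavor of such a mix (one plausible m-block instance with assumed notation, not necessarily the paper's exact scheme): the first and last blocks are updated in Gauss-Seidel fashion, while the middle blocks are updated in parallel and carry the only proximal terms.

\[
\begin{aligned}
x_1^{k+1} &= \arg\min_{x_1}\; \mathcal{L}_\beta\big(x_1, x_2^k, \dots, x_m^k, \lambda^k\big),\\
x_i^{k+1} &= \arg\min_{x_i}\; \mathcal{L}_\beta\big(x_1^{k+1}, \dots, x_i, \dots, x_m^k, \lambda^k\big) + \frac{\tau_i}{2}\,\|x_i - x_i^k\|^2, \quad i = 2,\dots,m-1\ \text{(in parallel)},\\
x_m^{k+1} &= \arg\min_{x_m}\; \mathcal{L}_\beta\big(x_1^{k+1}, \dots, x_{m-1}^{k+1}, x_m, \lambda^k\big),\\
\lambda^{k+1} &= \lambda^k + \beta\Big(\textstyle\sum_i A_i x_i^{k+1} - b\Big).
\end{aligned}
\]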