A proximal-perturbed Bregman ADMM for solving nonsmooth and nonconvex optimization problems

In this paper, we focus on a linearly constrained composite minimization problem whose objective function is possibly nonsmooth and nonconvex. Unlike the traditional construction of the augmented Lagrangian function, we provide a proximal-perturbed augmented Lagrangian and then develop a new Bregman Alternating Direction Method of Multipliers (ADMM). Under mild assumptions, we show that the novel augmented …
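For orientation (this display is the standard textbook construction, not the paper's perturbed variant), the classical augmented Lagrangian for the linearly constrained problem \(\min_x f(x)\) subject to \(Ax = b\) reads:

```latex
\mathcal{L}_{\rho}(x,\lambda) \;=\; f(x) \;+\; \langle \lambda,\, Ax - b\rangle \;+\; \frac{\rho}{2}\,\|Ax - b\|^{2},
```

where \(\lambda\) is the Lagrange multiplier and \(\rho > 0\) the penalty parameter; the proximal-perturbed construction described in the abstract modifies this function before the Bregman ADMM updates are applied.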

An inexact ADMM for separable nonconvex and nonsmooth optimization

An Inexact Alternating Direction Method of Multipliers (I-ADMM) with an expansion linesearch step was developed for solving a family of separable minimization problems subject to linear constraints, where the objective function is the sum of a smooth but possibly nonconvex function and a possibly nonsmooth nonconvex function. Global convergence and a linear convergence rate of the …
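As background, the classical exact two-block ADMM iteration, which inexact variants such as the one above relax with approximate subproblem solves, can be sketched on a toy problem. The function names and the toy instance below are illustrative, not from the paper: scaled ADMM for min 0.5*(x-c)^2 + lam*|y| subject to x = y, whose solution is the soft-thresholding of c.

```python
def soft(v, t):
    # soft-thresholding: proximal operator of t*|.|
    return max(v - t, 0.0) if v > 0 else min(v + t, 0.0)

def admm_lasso_scalar(c, lam, rho=1.0, iters=500):
    """Scaled two-block ADMM for min 0.5*(x-c)^2 + lam*|y| s.t. x = y."""
    y = u = 0.0
    for _ in range(iters):
        x = (c + rho * (y - u)) / (1.0 + rho)   # smooth subproblem (closed form)
        y = soft(x + u, lam / rho)              # nonsmooth subproblem (l1 prox)
        u = u + x - y                           # scaled dual (multiplier) update
    return y
```

For c = 3 and lam = 1 the iterates converge to soft(3, 1) = 2; an inexact ADMM would replace the exact x-update with an approximate solve controlled by an error criterion.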

Double-proximal augmented Lagrangian methods with improved convergence condition

In this paper, we consider a family of linearly constrained convex minimization problems whose objective function is not necessarily smooth. A basic double-proximal augmented Lagrangian method (DP-ALM) is developed, which enjoys a flexible dual stepsize and a proximal subproblem with a relatively small proximal parameter. By a novel prediction-correction reformulation for the proposed DP-ALM and by …

Generalized asymmetric forward-backward-adjoint algorithms for convex-concave saddle-point problem

The convex-concave minimax problem, also known as the saddle-point problem, has been extensively studied from various aspects, including algorithm design, convergence conditions, and complexity. In this paper, we propose a generalized asymmetric forward-backward-adjoint algorithm (G-AFBA) to solve such a problem by utilizing both proximal techniques and the extrapolation of primal-dual updates. Besides applying …
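For context, a well-known baseline in this primal-dual family is the Chambolle–Pock primal-dual hybrid gradient scheme for \(\min_x \max_y\; g(x) + \langle Kx, y\rangle - f^*(y)\) (a standard iteration, stated here for orientation rather than taken from the paper):

```latex
\begin{aligned}
y^{k+1} &= \operatorname{prox}_{\sigma f^*}\!\big(y^{k} + \sigma K \bar{x}^{k}\big),\\
x^{k+1} &= \operatorname{prox}_{\tau g}\!\big(x^{k} - \tau K^{\top} y^{k+1}\big),\\
\bar{x}^{k+1} &= x^{k+1} + \theta\,\big(x^{k+1} - x^{k}\big),
\end{aligned}
```

with stepsizes satisfying \(\tau\sigma\|K\|^{2} < 1\) and extrapolation \(\theta = 1\); AFBA-type methods generalize such schemes through asymmetric proximal terms and more flexible extrapolation of the primal-dual updates.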

Convergence Analysis of a Data-driven Inexact Proximal-indefinite Stochastic ADMM

In this paper, we propose an Inexact Proximal-indefinite Stochastic ADMM (abbreviated as IPS-ADMM) to solve a class of separable convex optimization problems whose objective functions consist of two parts: one is an average of many smooth convex functions and the other is a convex but potentially nonsmooth function. The involved smooth subproblem is tackled by …
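A common variance-reduction device for such finite-sum smooth terms is the SVRG-type gradient estimator. The sketch below is illustrative (a toy quadratic sum, not the paper's exact scheme): it forms the estimator from a stored snapshot point and verifies that averaging it over all component indices recovers the full gradient, i.e., that the estimator is unbiased.

```python
def grad_fi(i, x, a, b):
    # gradient of the i-th component f_i(x) = 0.5 * a[i] * (x - b[i])**2
    return a[i] * (x - b[i])

def svrg_gradient(x, snapshot, full_grad_at_snapshot, a, b, i):
    # variance-reduced estimator: unbiased for the full gradient at x
    return grad_fi(i, x, a, b) - grad_fi(i, snapshot, a, b) + full_grad_at_snapshot

a = [1.0, 2.0, 3.0]
b = [0.5, -1.0, 2.0]
snapshot = 0.7   # point where the full gradient was last computed
mu = sum(grad_fi(j, snapshot, a, b) for j in range(3)) / 3.0

x = 1.3
full_grad = sum(grad_fi(j, x, a, b) for j in range(3)) / 3.0
# averaging the estimator over all indices recovers the full gradient exactly
avg_est = sum(svrg_gradient(x, snapshot, mu, a, b, i) for i in range(3)) / 3.0
```

In a stochastic ADMM, one such estimator (with a randomly sampled index) replaces the exact gradient in the smooth subproblem, and the snapshot is refreshed periodically to keep the variance small.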

A family of accelerated inexact augmented Lagrangian methods with applications to image restoration

In this paper, we focus on a class of convex optimization problems subject to equality or inequality constraints and develop an Accelerated Inexact Augmented Lagrangian Method (AI-ALM). Different relative error criteria are designed to solve the subproblem of AI-ALM inexactly, and the widely used relaxation step is exploited to accelerate the convergence. By a …

An inexact ADMM with proximal-indefinite term and larger stepsize

In this paper, an inexact Alternating Direction Method of Multipliers (ADMM) is proposed for solving the two-block separable convex optimization problem subject to linear equality constraints. The first subproblem is solved inexactly under a relative error criterion, while the other subproblem, called the regularization problem, is solved inexactly by introducing an indefinite proximal term. Meanwhile, the …

A New Insight on Augmented Lagrangian Method with Applications in Machine Learning

By exploiting double-penalty terms for the primal subproblem, we develop a novel relaxed augmented Lagrangian method for solving a family of convex optimization problems subject to equality or inequality constraints. This new method is then extended to solve a general multi-block separable convex optimization problem, and two related primal-dual hybrid gradient algorithms are also discussed. …
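To make the classical baseline concrete, here is a minimal augmented Lagrangian iteration on a toy equality-constrained problem (an illustrative sketch of the standard method, not the paper's double-penalty variant): min 0.5*x**2 subject to x = 1, whose KKT solution is x* = 1, lambda* = -1.

```python
def alm_toy(rho=1.0, iters=200):
    """Classical augmented Lagrangian method for min 0.5*x**2 s.t. x = 1."""
    lam = 0.0
    for _ in range(iters):
        # primal step: exact minimizer of 0.5*x**2 + lam*(x-1) + 0.5*rho*(x-1)**2
        x = (rho - lam) / (1.0 + rho)
        # dual ascent on the constraint residual
        lam = lam + rho * (x - 1.0)
    return x, lam
```

Relaxed and double-penalty variants modify the primal step (extra proximal/penalty terms) and insert a relaxation factor into the dual update, which is where the improved convergence conditions come from.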

Accelerated Stochastic Peaceman-Rachford Method for Empirical Risk Minimization

This work is devoted to studying an Accelerated Stochastic Peaceman-Rachford Splitting Method (AS-PRSM) for solving a family of structured empirical risk minimization problems. The objective function to be optimized is the sum of a possibly nonsmooth convex function and a finite sum of smooth convex component functions. The smooth subproblem in AS-PRSM is solved by a stochastic gradient method using variance reduction …
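For context, the strictly contractive Peaceman–Rachford splitting for \(\min\, f(x) + g(y)\) subject to \(Ax + By = b\) updates the multiplier twice per sweep (a standard scheme, stated here for orientation):

```latex
\begin{aligned}
x^{k+1} &= \arg\min_{x}\; \mathcal{L}_{\beta}\big(x, y^{k}, \lambda^{k}\big),\\
\lambda^{k+1/2} &= \lambda^{k} - \alpha\beta\,\big(Ax^{k+1} + By^{k} - b\big),\\
y^{k+1} &= \arg\min_{y}\; \mathcal{L}_{\beta}\big(x^{k+1}, y, \lambda^{k+1/2}\big),\\
\lambda^{k+1} &= \lambda^{k+1/2} - \alpha\beta\,\big(Ax^{k+1} + By^{k+1} - b\big),
\end{aligned}
```

where \(\mathcal{L}_{\beta}\) is the augmented Lagrangian and \(\alpha \in (0,1)\) is the relaxation factor of the strictly contractive variant; AS-PRSM replaces the exact smooth subproblem solve with a variance-reduced stochastic gradient step.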

Iteration complexity analysis of a partial LQP-based alternating direction method of multipliers

In this paper, we consider a prototypical convex optimization problem with multi-block variables and separable structure. By adding the Logarithmic Quadratic Proximal (LQP) regularizer with a suitable proximal parameter to each of the first grouped subproblems, we develop a partial LQP-based Alternating Direction Method of Multipliers (ADMM-LQP). The dual variable is updated twice with relatively larger …