The symmetric ADMM with positive-indefinite proximal regularization and its application

Because it updates the Lagrange multiplier twice at each iteration, the symmetric alternating direction method of multipliers (S-ADMM) often performs better than other ADMM-type methods. In practice, proximal terms with positive definite proximal matrices are often added to its subproblems, and it is commonly known that a large proximal parameter in the proximal term often … Read more
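
For orientation, a sketch of the iteration in question under assumed notation (ours, not necessarily the paper's): for $\min_{x,y}\{f(x)+g(y) : Ax+By=b\}$ with augmented Lagrangian $\mathcal{L}_\beta(x,y,\lambda)=f(x)+g(y)-\lambda^\top(Ax+By-b)+\frac{\beta}{2}\|Ax+By-b\|^2$, step sizes $r,s$ and proximal matrices $D_1,D_2$, the proximal S-ADMM reads
$$\begin{aligned}
x^{k+1} &\in \arg\min_x\; \mathcal{L}_\beta(x, y^k, \lambda^k) + \tfrac{1}{2}\|x - x^k\|_{D_1}^2,\\
\lambda^{k+1/2} &= \lambda^k - r\beta\,(Ax^{k+1} + By^k - b),\\
y^{k+1} &\in \arg\min_y\; \mathcal{L}_\beta(x^{k+1}, y, \lambda^{k+1/2}) + \tfrac{1}{2}\|y - y^k\|_{D_2}^2,\\
\lambda^{k+1} &= \lambda^{k+1/2} - s\beta\,(Ax^{k+1} + By^{k+1} - b),
\end{aligned}$$
the point of the title being that $D_1,D_2$ may be positive indefinite rather than positive definite.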

Local Linear Convergence Analysis of Primal–Dual Splitting Methods

In this paper, we study the local linear convergence properties of a versatile class of primal-dual splitting methods for solving composite non-smooth convex optimization problems. Under the assumption that the non-smooth components of the problem are partly smooth relative to smooth manifolds, we present a unified local convergence analysis framework for these primal-dual splitting methods. … Read more
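
As a concrete member of the class under study (our choice of representative), the primal-dual splitting of Chambolle and Pock for $\min_x f(x) + g(Kx)$ iterates
$$\begin{aligned}
x^{k+1} &= \operatorname{prox}_{\tau f}\big(x^k - \tau K^* y^k\big),\\
y^{k+1} &= \operatorname{prox}_{\sigma g^*}\big(y^k + \sigma K(2x^{k+1} - x^k)\big),
\end{aligned}$$
with $\tau\sigma\|K\|^2 < 1$. Roughly, partial smoothness is the kind of assumption that lets the iterates identify the underlying manifolds in finitely many steps, after which the dynamics become locally linear.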

ADMM for monotone operators: convergence analysis and rates

We propose in this paper a unifying scheme for several algorithms from the literature dedicated to solving monotone inclusion problems involving compositions with continuous linear operators in infinite-dimensional Hilbert spaces. We show that a number of primal-dual algorithms for monotone inclusions, as well as the classical ADMM numerical scheme for convex optimization problems, … Read more
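
For reference, the classical ADMM scheme covered by this unifying view, in one standard form for $\min_{x,y}\{f(x)+g(y) : Ax+By=b\}$ with penalty $\beta>0$ (notation ours):
$$\begin{aligned}
x^{k+1} &\in \arg\min_x\; f(x) + \tfrac{\beta}{2}\big\|Ax + By^k - b + \beta^{-1}\lambda^k\big\|^2,\\
y^{k+1} &\in \arg\min_y\; g(y) + \tfrac{\beta}{2}\big\|Ax^{k+1} + By - b + \beta^{-1}\lambda^k\big\|^2,\\
\lambda^{k+1} &= \lambda^k + \beta\,(Ax^{k+1} + By^{k+1} - b).
\end{aligned}$$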

A symmetric version of the generalized alternating direction method of multipliers for two-block separable convex programming

This paper introduces a symmetric version of the generalized alternating direction method of multipliers for two-block separable convex programming with linear equality constraints, which inherits the advantages of the classical alternating direction method of multipliers (ADMM) and extends the feasible set of the relaxation factor $\alpha$ of the generalized ADMM to the infinite interval $[1,+\infty)$. … Read more
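
In the generalized ADMM of Eckstein and Bertsekas, the relaxation factor $\alpha$ enters the $y$- and multiplier updates by replacing $Ax^{k+1}$ with the combination
$$\alpha\,Ax^{k+1} - (1-\alpha)\,(By^k - b),$$
classically with $\alpha \in (0,2)$; it is the symmetric variant of the present paper that allows the admissible range to be pushed to $[1,+\infty)$.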

A Note on the Forward-Douglas–Rachford Splitting for Monotone Inclusion and Convex Optimization

We shed light on the structure of the “three-operator” version of the forward-Douglas–Rachford splitting algorithm for finding a zero of a sum of maximally monotone operators $A + B + C$, where $B$ is cocoercive, involving only the computation of $B$ and of the resolvents of $A$ and of $C$, separately. We show that it … Read more
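
In one common parameterization (in the style of Davis and Yin; the note's exact form may differ), such a three-operator scheme iterates, for step size $\gamma>0$ and relaxation $\theta_k$,
$$\begin{aligned}
x_C^{k} &= J_{\gamma C}(z^k),\\
x_A^{k} &= J_{\gamma A}\big(2x_C^k - z^k - \gamma\,Bx_C^k\big),\\
z^{k+1} &= z^k + \theta_k\,(x_A^k - x_C^k),
\end{aligned}$$
where $J_{\gamma A} = (\mathrm{Id} + \gamma A)^{-1}$ denotes the resolvent; as the abstract emphasizes, only $B$ itself and the resolvents of $A$ and $C$ are ever evaluated, and separately.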

A Hausdorff-type distance, a directional derivative of a set-valued map and applications in set optimization

In this paper, we follow Kuroiwa’s set approach in set optimization, which proposes to compare values of a set-valued objective map $F$ with respect to various set order relations. We introduce a Hausdorff-type distance relative to an ordering cone between two sets in a Banach space and use it to define a directional derivative for $F$. … Read more
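
For orientation, the classical Hausdorff distance between nonempty sets $A$ and $B$ in a normed space, of which the paper's construction is a cone-relative variant, is
$$d_H(A,B) = \max\Big\{\sup_{a\in A}\,\inf_{b\in B}\|a-b\|,\; \sup_{b\in B}\,\inf_{a\in A}\|a-b\|\Big\}.$$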

Facially dual complete (nice) cones and lexicographic tangents

We study the boundary structure of closed convex cones, with a focus on facially dual complete (nice) cones. These cones form a proper subset of facially exposed convex cones, and they behave well in the context of duality theory for convex optimization. Using the well-known concept of tangent cones in nonlinear … Read more
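
The tangent cone meant here is the standard convex-analytic one: for a closed convex cone $K$ and $x \in K$,
$$T_K(x) = \operatorname{cl}\Big(\bigcup_{t>0}\tfrac{1}{t}\,(K - x)\Big),$$
the closed cone of feasible directions at $x$; the lexicographic tangents of the title arise from iterating this construction (see the paper for the precise definition).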

Partially separable convexly-constrained optimization with non-Lipschitz singularities and its complexity

An adaptive regularization algorithm using high-order models is proposed for partially separable convexly constrained nonlinear optimization problems whose objective function contains non-Lipschitzian $\ell_q$-norm regularization terms for $q\in (0,1)$. It is shown that the algorithm using a $p$-th order Taylor model for $p$ odd needs in general at most $O(\epsilon^{-(p+1)/p})$ evaluations of the objective function and … Read more
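
In standard adaptive-regularization form (notation ours), the $p$-th order model built from the degree-$p$ Taylor expansion $T_p(x_k,s)$ of the objective is
$$m_k(s) = T_p(x_k, s) + \frac{\sigma_k}{p+1}\,\|s\|^{p+1},$$
with regularization weight $\sigma_k > 0$ adapted from iteration to iteration; the $O(\epsilon^{-(p+1)/p})$ bound then counts the evaluations needed to reach $\epsilon$-approximate first-order criticality, as is standard in this literature.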

Several variants of the primal-dual hybrid gradient algorithm with applications

By reviewing the primal-dual hybrid gradient algorithm (PDHG) proposed by He, You and Yuan (SIAM J. Imaging Sci. 2014;7(4):2526-2537), in this paper we introduce four improved schemes for solving a class of generalized saddle-point problems. By making use of a variational inequality characterization, we present weaker conditions ensuring the global convergence of the proposed algorithms, where … Read more
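
The variational-inequality device alluded to is standard for saddle-point problems $\min_x\max_y\,\theta_1(x) + y^\top Ax - \theta_2(y)$ (our notation): $u^* = (x^*, y^*)$ is a saddle point if and only if
$$\theta(u) - \theta(u^*) + (u - u^*)^\top F(u^*) \ge 0 \quad \text{for all } u, \qquad F(u) = \begin{pmatrix} A^\top y \\ -Ax \end{pmatrix},$$
with $\theta(u) = \theta_1(x) + \theta_2(y)$; convergence conditions for PDHG-type schemes are then read off from this inequality.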

Iteration-Complexity of a Linearized Proximal Multiblock ADMM Class for Linearly Constrained Nonconvex Optimization Problems

This paper analyzes the iteration-complexity of a class of linearized proximal multiblock alternating direction methods of multipliers (ADMM) for solving linearly constrained nonconvex optimization problems. The subproblems of the linearized ADMM are obtained by partially or fully linearizing the augmented Lagrangian with respect to the corresponding minimizing block variable. The derived complexity bounds do not … Read more
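
A typical fully linearized subproblem in such a class (a sketch under assumed notation, for blocks $x_1,\dots,x_N$, augmented Lagrangian $\mathcal{L}_\beta$ and proximal parameters $\eta_i > 0$) amounts to a gradient-type step
$$x_i^{k+1} = x_i^k - \eta_i\,\nabla_{x_i}\mathcal{L}_\beta\big(x_1^{k+1},\dots,x_{i-1}^{k+1},x_i^k,\dots,x_N^k,\lambda^k\big),$$
equivalently the minimizer of the linearization of $\mathcal{L}_\beta$ at $x_i^k$ plus the proximal term $\frac{1}{2\eta_i}\|x_i - x_i^k\|^2$; partial linearization instead keeps part of $\mathcal{L}_\beta$ exact and linearizes only a smooth remainder.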