Multi-step discrete-time Zhang neural networks with application to time-varying nonlinear optimization

As a special kind of recurrent neural network, the Zhang neural network (ZNN) has been successfully applied to solving various time-variant problems. In this paper, we first propose a special two-step Zhang et al. discretization (ZeaD) formula and a general two-step ZeaD formula, whose truncation errors are $O(\tau^3)$ and $O(\tau^2)$, respectively, where $\tau>0$ denotes the sampling …
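
The truncated abstract does not give the coefficients of the two ZeaD formulas, so the following is only a minimal sketch of the overall recipe on a scalar toy problem: design a continuous ZNN model that drives the time-varying gradient to zero, then discretize it. For readability the sketch uses the one-step Euler discretization (truncation error $O(\tau^2)$) in place of the paper's two-step formulas; the objective, $\lambda$, and $\tau$ are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a discrete-time ZNN for the toy time-varying
# problem  min_x f(x, t) = 0.5 * (x - sin t)^2,  whose time-varying
# minimizer is x*(t) = sin t.  The one-step Euler discretization is
# used here as a stand-in for the paper's two-step ZeaD formulas;
# the objective, lam and tau are illustrative, not the paper's.

tau, lam = 0.01, 10.0      # sampling gap tau and ZNN design parameter
h = tau * lam              # effective step size
x = 0.0                    # initial state

for k in range(1000):
    t = k * tau
    g = x - np.sin(t)      # gradient g(x, t) = df/dx
    g_t = -np.cos(t)       # partial derivative of g with respect to t
    # Continuous ZNN design: xdot = -(lam * g + g_t) (Hessian = 1 here),
    # discretized by x_{k+1} = x_k + tau * xdot_k.
    x = x - (h * g + tau * g_t)

print(f"x = {x:.6f},  target sin(t) = {np.sin(1000 * tau):.6f}")
```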

A class of derivative-free CG projection methods for nonsmooth equations with an application to the LASSO problem

In this paper, based on a modified Gram-Schmidt (MGS) process, we propose a class of derivative-free conjugate gradient (CG) projection methods for nonsmooth equations with convex constraints. Two attractive features of the new class of methods are: (i) the generated direction contains a free vector, which can be set to any vector such that the …
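
As a rough illustration of the algorithmic template (not the paper's MGS-based direction), the sketch below applies a hyperplane-projection method with a plain PRP-type CG direction to a toy monotone system; the mapping F, the constraint set, and all parameters are assumptions. The paper applies its methods to the LASSO problem through an equivalent reformulation as constrained nonsmooth equations.

```python
import numpy as np

# Minimal sketch of a derivative-free CG projection method for a
# monotone system F(x) = 0 over a convex set C (here the nonnegative
# orthant, whose projection is a simple clipping).  The CG parameter
# is a plain PRP-type choice for illustration; the paper's direction
# is instead built from a modified Gram-Schmidt process and carries
# an extra free vector, which is not reproduced here.

def F(x):
    # Toy monotone mapping; its unique zero on C = {x >= 0} is x = 0.
    return np.exp(x) - 1.0

def project(x):
    return np.maximum(x, 0.0)          # projection onto C

sigma, rho, s = 1e-4, 0.5, 1.0         # line-search parameters
x = np.ones(5)
Fx = F(x)
d = -Fx

for k in range(200):
    if np.linalg.norm(Fx) < 1e-8:
        break
    # Derivative-free line search: shrink t until
    #   -F(x + t*d)^T d >= sigma * t * ||d||^2.
    t = s
    while -F(x + t * d) @ d < sigma * t * (d @ d):
        t *= rho
    z = x + t * d
    Fz = F(z)
    # Project x onto the separating hyperplane, then onto C.
    x_new = project(x - (Fz @ (x - z) / (Fz @ Fz)) * Fz)
    F_new = F(x_new)
    # PRP-type CG direction, with a safeguard that keeps
    # -F(x)^T d > 0 so the line search always terminates.
    d = -F_new + (F_new @ (F_new - Fx) / (Fx @ Fx)) * d
    if F_new @ d >= 0:
        d = -F_new
    x, Fx = x_new, F_new

print(f"iterations: {k},  residual: {np.linalg.norm(Fx):.2e}")
```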

Improved proximal ADMM with partially parallel splitting for multi-block separable convex programming

For a type of multi-block separable convex programming arising in machine learning and statistical inference, we propose a proximal alternating direction method of multipliers (PADMM) with partially parallel splitting, which has the following nice properties: (1) To lighten the weight of the proximal terms, the restrictions imposed on the proximal parameters are relaxed substantially; (2) …
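
A minimal sketch of the partially parallel splitting pattern on a hypothetical three-block toy problem is given below: the first block is updated in Gauss-Seidel fashion, the remaining two in parallel with proximal terms. The problem data and the parameters $\beta$ and $\mu$ are illustrative, not the paper's relaxed conditions.

```python
import numpy as np

# Minimal sketch of proximal ADMM with partially parallel splitting
# for the 3-block toy problem
#     min  0.5*||x1 - c1||^2 + 0.5*||x2 - c2||^2 + 0.5*||x3 - c3||^2
#     s.t. x1 + x2 + x3 = b.
# Block x1 is updated first; x2 and x3 are then updated in parallel,
# each regularized by a proximal term (mu/2)*||xi - xi^k||^2.
# beta and mu are illustrative; the paper's contribution is precisely
# to relax the admissible range of such proximal parameters.

n = 4
rng = np.random.default_rng(0)
c1, c2, c3, b = (rng.standard_normal(n) for _ in range(4))
beta, mu = 1.0, 2.0
x1 = x2 = x3 = np.zeros(n)
lam = np.zeros(n)

for k in range(300):
    # Block 1: exact minimization of the augmented Lagrangian.
    x1 = (c1 + lam - beta * (x2 + x3 - b)) / (1.0 + beta)
    # Blocks 2 and 3: proximal subproblems solved in parallel, each
    # using the old value of the other parallel block.
    x2_new = (c2 + lam - beta * (x1 + x3 - b) + mu * x2) / (1.0 + beta + mu)
    x3_new = (c3 + lam - beta * (x1 + x2 - b) + mu * x3) / (1.0 + beta + mu)
    x2, x3 = x2_new, x3_new
    # Multiplier update.
    lam = lam - beta * (x1 + x2 + x3 - b)

print("constraint residual:", np.linalg.norm(x1 + x2 + x3 - b))
```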

Proximal ADMM with larger step size for two-block separable convex programs

The alternating direction method of multipliers (ADMM) is a benchmark for solving two-block separable convex programs, and it has found applications in more and more areas. However, like other first-order methods, ADMM suffers from slow convergence. In this paper, to accelerate the convergence of ADMM, we relax the restriction region of the Fortin and …
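
For reference, the sketch below runs the standard two-block ADMM on a toy $\ell_1$ problem with a dual step size $\gamma$ inside the classical Fortin-Glowinski region $(0,(1+\sqrt{5})/2)$; the paper's proximal variant, which enlarges this region, is not reproduced. All problem data are illustrative.

```python
import numpy as np

# Minimal sketch of two-block ADMM with a dual step size gamma in the
# classical Fortin-Glowinski region (0, (1+sqrt(5))/2) for the toy
# problem  min 0.5*||x - c||^2 + rho*||y||_1  s.t.  x - y = 0.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

n, rho, beta = 5, 0.5, 1.0
gamma = 1.6                      # < (1 + sqrt(5)) / 2 ~= 1.618
rng = np.random.default_rng(1)
c = rng.standard_normal(n)
x = y = lam = np.zeros(n)

for k in range(200):
    # x-subproblem: solve (x - c) - lam + beta * (x - y) = 0.
    x = (c + lam + beta * y) / (1.0 + beta)
    # y-subproblem: soft-thresholding (shrinkage) step.
    y = soft_threshold(x - lam / beta, rho / beta)
    # Dual update with the enlarged step size gamma.
    lam = lam - gamma * beta * (x - y)

print("constraint residual:", np.linalg.norm(x - y))
```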

The symmetric ADMM with positive-indefinite proximal regularization and its application

Because it updates the Lagrange multiplier twice at each iteration, the symmetric alternating direction method of multipliers (S-ADMM) often performs better than other ADMM-type methods. In practice, proximal terms with positive definite proximal matrices are often added to its subproblems, and it is commonly known that a large proximal parameter in the proximal term often …
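
The sketch below illustrates the S-ADMM iteration order (x-update, first multiplier update, proximal y-update, second multiplier update) on a toy $\ell_1$ problem. The step sizes $r$, $s$ and the positive proximal weight $\tau$ are illustrative assumptions; the paper's point is that the proximal matrix may even be chosen positive-indefinite.

```python
import numpy as np

# Minimal sketch of the symmetric ADMM (S-ADMM), which updates the
# multiplier twice per iteration, applied to the toy problem
#     min 0.5*||x - c||^2 + rho*||y||_1   s.t.  x - y = 0,
# with a proximal term tau*beta/2*||y - y^k||^2 in the y-subproblem.
# r, s and tau are illustrative; the paper shows the proximal matrix
# may even be positive-indefinite, i.e. tau can be pushed below the
# classical positive-definite bound.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

n, rho, beta, tau = 5, 0.5, 1.0, 0.3
r, s = 0.9, 0.9                   # the two dual step sizes
rng = np.random.default_rng(2)
c = rng.standard_normal(n)
x = y = lam = np.zeros(n)

for k in range(300):
    # x-subproblem (exact).
    x = (c + lam + beta * y) / (1.0 + beta)
    # First multiplier update.
    lam = lam - r * beta * (x - y)
    # Proximal y-subproblem: soft-thresholding with the prox term.
    y = soft_threshold((beta * x - lam + tau * beta * y) / ((1 + tau) * beta),
                       rho / ((1 + tau) * beta))
    # Second multiplier update.
    lam = lam - s * beta * (x - y)

print("constraint residual:", np.linalg.norm(x - y))
```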

A symmetric version of the generalized alternating direction method of multipliers for two-block separable convex programming

This paper introduces a symmetric version of the generalized alternating direction method of multipliers for two-block separable convex programming with linear equality constraints, which inherits the advantages of the classical alternating direction method of multipliers (ADMM) and extends the feasible set of the relaxation factor $\alpha$ of the generalized ADMM to the infinite interval $[1,+\infty)$. …
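
Since the abstract does not spell out the symmetric update order, the sketch below only shows the classical generalized ADMM with relaxation factor $\alpha \in (0,2)$ on a toy $\ell_1$ problem; the paper's symmetric variant, which extends $\alpha$ to $[1,+\infty)$, is not reproduced. All data and parameters are illustrative.

```python
import numpy as np

# Minimal sketch of the classical *generalized* ADMM with relaxation
# factor alpha in (0, 2) for the toy problem
#     min 0.5*||x - c||^2 + rho*||y||_1   s.t.  x - y = 0.
# alpha = 1 recovers the plain ADMM iteration.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

n, rho, beta, alpha = 5, 0.5, 1.0, 1.5
rng = np.random.default_rng(3)
c = rng.standard_normal(n)
x = y = lam = np.zeros(n)

for k in range(300):
    # x-subproblem (exact).
    x = (c + lam + beta * y) / (1.0 + beta)
    # Relaxation step: over-relax the x-block before the y-update.
    x_rel = alpha * x + (1.0 - alpha) * y
    # y-subproblem and multiplier update use the relaxed point.
    y = soft_threshold(x_rel - lam / beta, rho / beta)
    lam = lam - beta * (x_rel - y)

print("constraint residual:", np.linalg.norm(x - y))
```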