Modified alternating direction methods for the modified multiple-sets split feasibility problems

In this paper, we propose two new multiple-sets split feasibility problem (MSFP) models, where the MSFP requires finding a point closest to the intersection of a family of closed convex sets in one space, such that its image under a linear transformation is closest to the intersection of another family of closed convex sets in the image space. This problem arises in image restoration, …
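For context, a minimal sketch of the classical projection-gradient scheme for the standard MSFP is given below, assuming that Euclidean projections onto the sets $C_i$ and $Q_j$ are available; it minimizes a weighted proximity function and is meant only as background on the problem class, not as the new models or the modified alternating direction methods proposed in the paper.

```python
import numpy as np

def msfp_projection_gradient(A, proj_Cs, proj_Qs, x0,
                             alphas=None, betas=None, gamma=None, iters=500):
    """Sketch of the classical gradient-projection scheme for the standard MSFP:
    minimize the weighted proximity function
        p(x) = 1/2 * sum_i a_i ||x - P_{C_i}(x)||^2
             + 1/2 * sum_j b_j ||A x - P_{Q_j}(A x)||^2
    by the iteration x <- x - gamma * grad p(x)."""
    alphas = alphas if alphas is not None else [1.0 / len(proj_Cs)] * len(proj_Cs)
    betas = betas if betas is not None else [1.0 / len(proj_Qs)] * len(proj_Qs)
    if gamma is None:
        # a Lipschitz constant of grad p:  sum_i a_i + ||A||_2^2 * sum_j b_j
        L = sum(alphas) + np.linalg.norm(A, 2) ** 2 * sum(betas)
        gamma = 1.0 / L
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        grad = sum(a * (x - P(x)) for a, P in zip(alphas, proj_Cs))
        Ax = A @ x
        grad = grad + A.T @ sum(b * (Ax - P(Ax)) for b, P in zip(betas, proj_Qs))
        x = x - gamma * grad
    return x

# hypothetical toy usage: C = nonnegative orthant, Q = a box, A random
A = np.random.randn(3, 5)
proj_Cs = [lambda v: np.maximum(v, 0.0)]
proj_Qs = [lambda v: np.clip(v, -1.0, 1.0)]
x = msfp_projection_gradient(A, proj_Cs, proj_Qs, np.random.randn(5))
```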

Convergence rate and iteration complexity on the alternating direction method of multipliers with a substitution procedure for separable convex programming

Recently, in [17] we showed for the first time the possibility of combining the Douglas-Rachford alternating direction method of multipliers (ADMM) with a Gaussian back substitution procedure for solving a convex minimization model with a general separable structure. This paper is a further study of the theoretical aspects of this theme. We first derive a general algorithmic framework …
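For orientation, the separable model referred to here is typically written as follows (a generic statement; the precise assumptions and the Gaussian back substitution step are as in the paper and [17]):

$$\min\Big\{\sum_{i=1}^{m}\theta_i(x_i)\;\Big|\;\sum_{i=1}^{m}A_i x_i=b,\;\;x_i\in\mathcal{X}_i,\ i=1,\dots,m\Big\},$$

where each $\theta_i$ is a closed convex function and each $\mathcal{X}_i$ a closed convex set; the method performs an ADMM-type sweep over the blocks $x_1,\dots,x_m$ and then corrects the output by a Gaussian back substitution step.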

On non-ergodic convergence rate of Douglas-Rachford alternating direction method of multipliers

Recently, a worst-case O(1/t) convergence rate was established for the Douglas-Rachford alternating direction method of multipliers in the ergodic sense. This note proposes a novel approach to derive the same convergence rate in a non-ergodic sense.
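To fix terminology (a generic description, not the paper's precise statements): an ergodic rate is attached to the averaged iterate, whereas a non-ergodic rate is attached to the iterates themselves, commonly through a successive-difference measure such as

$$\bar w_t=\frac{1}{t}\sum_{k=1}^{t}w^k\quad\text{(ergodic average)},\qquad\quad \|w^t-w^{t+1}\|^2\le\frac{C}{t}\quad\text{(a non-ergodic measure)},$$

for some constant $C$ depending on the initial point.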

On the O(1/t) convergence rate of alternating direction method

The classical alternating direction method (ADM) has found many new applications recently, and its empirical efficiency has been well illustrated in various fields. However, estimating ADM's convergence rate has remained a theoretical challenge for decades. In this note, we provide a uniform proof of the O(1/t) convergence rate for both the …
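As a concrete reference point, here is a minimal sketch of the classical two-block ADM iteration on a toy consensus problem in which both subproblems have closed-form minimizers; the toy problem, penalty value, and scaled-dual form are illustrative assumptions, not taken from the note.

```python
import numpy as np

def adm_consensus(c, d, beta=1.0, iters=200):
    """Classical two-block alternating direction method (scaled dual form) on
        min 0.5*||x - c||^2 + 0.5*||y - d||^2   s.t.  x - y = 0,
    whose exact solution is x = y = (c + d) / 2."""
    x = np.zeros_like(c, dtype=float)
    y = np.zeros_like(c, dtype=float)
    u = np.zeros_like(c, dtype=float)             # scaled multiplier u = lambda / beta
    for _ in range(iters):
        x = (c + beta * (y - u)) / (1.0 + beta)   # x-subproblem (closed form)
        y = (d + beta * (x + u)) / (1.0 + beta)   # y-subproblem, uses the new x
        u = u + (x - y)                           # multiplier update
    return x, y

c, d = np.array([1.0, 2.0]), np.array([3.0, 0.0])
x, y = adm_consensus(c, d)
# both x and y approach (c + d) / 2 = [2.0, 1.0]
```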

A relaxed customized proximal point algorithm for separable convex programming

The alternating direction method (ADM) is a classical method for solving a linearly constrained separable convex programming problem (the primal problem), and it is well known that ADM is essentially the application of a concrete form of the proximal point algorithm (PPA) (more precisely, the Douglas-Rachford splitting method) to the corresponding dual problem. This paper shows that an …
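Schematically, a relaxed PPA step has the following generic form (a sketch only; the customized metric and the relaxation analysis of the paper are not reproduced here): the proximal point $\tilde w^k$ solves a regularized inclusion and is then relaxed,

$$0\in F(\tilde w^k)+G(\tilde w^k-w^k),\qquad w^{k+1}=w^k-\gamma\,(w^k-\tilde w^k),\quad\gamma\in(0,2),$$

where $F$ denotes the monotone operator of the primal-dual optimality system and $G$ is a symmetric positive (semi)definite metric chosen so that the subproblem is easy to solve.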

On the O(1/t) convergence rate of the projection and contraction methods for variational inequalities with Lipschitz continuous monotone operators

Recently, Nemirovski's analysis indicated that the extragradient method achieves an $O(1/t)$ convergence rate for variational inequalities with Lipschitz continuous monotone operators. For the same class of problems, over the last decades we have developed a class of Fejér monotone projection and contraction methods. Until now, only convergence results have been available for these projection and contraction methods, though …
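Since the abstract invokes Nemirovski's extragradient analysis, here is a minimal sketch of the extragradient iteration for a monotone variational inequality, assuming a Lipschitz operator and an easily computable projection; the projection and contraction methods use a related prediction step but a different, contraction-type correction, which is not shown here.

```python
import numpy as np

def extragradient(F, proj, x0, tau, iters=1000):
    """Extragradient method for the monotone VI: find x* in Omega with
    (x - x*)^T F(x*) >= 0 for all x in Omega, where F is Lipschitz with
    constant L and the step size satisfies tau < 1/L."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = proj(x - tau * F(x))      # prediction (trial point)
        x = proj(x - tau * F(y))      # correction, using F at the trial point
    return x

# hypothetical toy usage: affine monotone operator, Omega = nonnegative orthant
M = np.array([[4.0, 1.0], [-1.0, 2.0]])
q = np.array([-1.0, -1.0])
F = lambda x: M @ x + q
proj = lambda x: np.maximum(x, 0.0)
L = np.linalg.norm(M, 2)
x = extragradient(F, proj, np.zeros(2), tau=0.9 / L)
```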

Convergence and Convergence Rate of Stochastic Gradient Search in the Case of Multiple and Non-Isolated Extrema

The asymptotic behavior of stochastic gradient algorithms is studied. Relying on results from differential geometry (the Lojasiewicz gradient inequality), almost sure point-convergence is demonstrated and relatively tight almost sure bounds on the convergence rate are derived. In sharp contrast to all existing results of this kind, the asymptotic results obtained here do not require …
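A minimal sketch of the type of stochastic gradient iteration being analyzed, with a noisy gradient oracle and diminishing steps; the paper's actual noise model and step-size conditions are not encoded here, and the toy objective is an illustrative assumption.

```python
import numpy as np

def stochastic_gradient_search(grad_estimate, x0, iters=10000, a=1.0, alpha=0.75):
    """Generic stochastic gradient iteration
        x_{k+1} = x_k - gamma_k * (grad f(x_k) + noise_k),
    with gamma_k = a / (k + 1)^alpha and 0.5 < alpha <= 1."""
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        gamma = a / (k + 1) ** alpha
        x = x - gamma * grad_estimate(x)
    return x

# toy usage: noisy gradient of f(x) = 0.25*||x||^4, whose minimizer at the
# origin is degenerate (the Hessian vanishes there)
rng = np.random.default_rng(0)
grad_estimate = lambda x: (x @ x) * x + 0.1 * rng.standard_normal(x.shape)
x = stochastic_gradient_search(grad_estimate, np.array([1.0, -1.0]))
```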

Approximate Primal Solutions and Rate Analysis in Dual Subgradient Methods

We study primal solutions obtained as a by-product of subgradient methods when solving the Lagrangian dual of a primal convex constrained optimization problem (possibly nonsmooth). The existing literature on the use of subgradient methods for generating primal optimal solutions is limited to methods that produce such solutions only asymptotically (i.e., in the limit as the …
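A minimal sketch of primal recovery by averaging inside a dual subgradient method, on a hypothetical toy problem whose Lagrangian minimizers have closed form; the step-size rules and error bounds analyzed in the paper are not reproduced here.

```python
import numpy as np

def dual_subgradient_with_averaging(n=5, iters=2000):
    """Projected dual subgradient ascent on the Lagrangian dual of
        min 0.5*||x||^2   s.t.  1 - sum(x) <= 0,   x in [0, 10]^n,
    keeping the running average of the Lagrangian minimizers as an
    approximate primal solution.  The exact solution is
    x* = (1/n)*ones and mu* = 1/n."""
    mu = 0.0
    x_avg = np.zeros(n)
    for k in range(1, iters + 1):
        # Lagrangian minimizer for the current multiplier:
        #   argmin_x 0.5*||x||^2 + mu*(1 - sum(x))   over the box
        x_k = np.clip(mu * np.ones(n), 0.0, 10.0)
        # projected subgradient step on the dual; g(x_k) = 1 - sum(x_k)
        step = 1.0 / k
        mu = max(mu + step * (1.0 - x_k.sum()), 0.0)
        # running average of the primal iterates
        x_avg += (x_k - x_avg) / k
    return x_avg, mu

x_avg, mu = dual_subgradient_with_averaging()
# x_avg approaches (1/n)*ones and mu approaches 1/n
```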

A New Conjugate Gradient Algorithm Incorporating Adaptive Ellipsoid Preconditioning

The conjugate gradient (CG) algorithm is well known to have excellent theoretical properties for solving linear systems of equations $Ax = b$ where the $n\times n$ matrix $A$ is symmetric positive definite. However, for extremely ill-conditioned matrices the CG algorithm performs poorly in practice. In this paper, we discuss an adaptive preconditioning procedure that improves the …
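For context, a standard preconditioned CG loop is sketched below, where the preconditioner enters only through solves $Mz = r$; this is the generic PCG skeleton with a Jacobi preconditioner as a placeholder, not the adaptive ellipsoid preconditioning procedure of the paper.

```python
import numpy as np

def preconditioned_cg(A, b, M_solve, x0=None, tol=1e-10, maxiter=None):
    """Standard preconditioned conjugate gradient for A x = b with A symmetric
    positive definite.  M_solve(r) returns M^{-1} r for an SPD preconditioner M;
    M_solve = lambda r: r recovers plain CG."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    maxiter = 10 * n if maxiter is None else maxiter
    r = b - A @ x
    z = M_solve(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_solve(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# toy usage: an SPD matrix with a wildly varying diagonal, where even the
# simple Jacobi (diagonal) preconditioner helps substantially
n = 50
C = 0.01 * np.random.randn(n, n)
A = np.diag(np.logspace(0, 6, n)) + C @ C.T
b = np.random.randn(n)
M_solve = lambda r: r / np.diag(A)
x = preconditioned_cg(A, b, M_solve)
```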