Discerning the linear convergence of ADMM for structured convex optimization through the lens of variational analysis

Despite the rich literature, the linear convergence of the alternating direction method of multipliers (ADMM) has not been fully understood even for the convex case. For example, the linear convergence of ADMM can be empirically observed in a wide range of applications, while existing theoretical results seem to be too stringent to be satisfied or too … Read more
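
To make the setting above concrete, here is a minimal numerical sketch of the generic two-block ADMM iteration applied to a structured convex problem (a lasso instance chosen only for illustration; the problem data, the penalty parameter `rho`, and the solver details are assumptions, not the paper's specific model):

```python
# Hypothetical illustration: two-block ADMM for the structured convex problem
#   min_x 0.5*||A x - b||^2 + lam*||z||_1   subject to  x - z = 0,
# a setting in which linear convergence of ADMM is often observed empirically.
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)  # u is the scaled dual variable
    # Cache the Cholesky factor of (A^T A + rho I) reused in every x-update.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: minimize the augmented Lagrangian in x (a linear solve).
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: proximal step on lam*||.||_1 (soft-thresholding).
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # Dual ascent on the scaled multiplier.
        u = u + x - z
    return z
```

In experiments of this kind the primal residual `x - z` typically decays geometrically, which is the empirical linear convergence the abstract refers to.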

Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems

We study the linear convergence of some first-order methods, such as the proximal gradient method (PGM), the proximal alternating linearized minimization (PALM) algorithm and the randomized block coordinate proximal gradient method (R-BCPGM), for minimizing the sum of a smooth convex function and a nonsmooth convex function, from a variational analysis perspective. We introduce a new analytic … Read more
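
As a concrete instance of the composite model above, the following sketch shows the proximal gradient method (PGM) on the sum of a smooth convex function and a nonsmooth convex function (the specific choice f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1 and the step-size rule are illustrative assumptions, not taken from the paper):

```python
# A minimal proximal gradient (forward-backward) sketch for min_x f(x) + g(x),
# with f smooth convex and g nonsmooth convex; here f is a least-squares term
# and g the l1 norm, so prox_g is soft-thresholding.
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, iters=500):
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L, L = Lipschitz constant of grad f
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth part
        x = prox_l1(x - step * grad, step * lam)  # forward-backward step
    return x
```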

Error bounds for rank constrained optimization problems and applications

This paper is concerned with the rank constrained optimization problem whose feasible set is the intersection of the rank constraint set $\mathcal{R}=\big\{X\in\mathbb{X}\mid {\rm rank}(X)\le \kappa\big\}$ and a closed convex set $\Omega$. We establish local (global) Lipschitzian-type error bounds for estimating the distance from any $X\in \Omega$ ($X\in\mathbb{X}$) to the feasible set and … Read more
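
For intuition on the distance quantity that such error bounds estimate, the following small sketch computes the Frobenius distance from a matrix $X$ to the rank constraint set $\mathcal{R}$ alone via the Eckart-Young theorem (an illustration only; it ignores the intersection with $\Omega$ and is not the paper's construction):

```python
# Distance from X to R = {X : rank(X) <= kappa} in the Frobenius norm:
# by Eckart-Young, the nearest rank-<=kappa matrix is the truncated SVD,
# so the distance equals the norm of the discarded tail singular values.
import numpy as np

def dist_to_rank_set(X, kappa):
    s = np.linalg.svd(X, compute_uv=False)
    return np.linalg.norm(s[kappa:])

def project_to_rank_set(X, kappa):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[kappa:] = 0.0
    return (U * s) @ Vt   # a nearest point of R to X
```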

Strong slopes of a vector-valued map and applications in the study of error bounds, weak sharp minima and calmness

Using Hiriart-Urruty’s signed distance function, we present new definitions of strong slopes for a vector-valued map recently introduced in [E.M. Bednarczuk, A.Y. Kruger, Error bounds for vector-valued functions on metric spaces. Vietnam J. Math. 40 (2012), no. 2-3, 165-180]. With the new presentation, we are able to show that these slopes enjoy most properties of … Read more
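
For readers unfamiliar with the tool mentioned above, Hiriart-Urruty's signed (oriented) distance function of a set is commonly defined as follows (a standard background definition, not a statement of the paper's new slopes):

```latex
% Signed (oriented) distance function of a set A in a normed space X:
% negative inside A, positive outside, zero on the boundary of A.
\[
  \Delta_A(x) \;=\; d(x,A) \;-\; d\big(x,\,X\setminus A\big),
  \qquad d(x,S) \;:=\; \inf_{y\in S}\|x-y\|.
\]
```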

Calmness of linear programs under perturbations of all data: characterization and modulus

This paper provides operative point-based formulas (involving only the nominal data, not data in a neighborhood) for computing or estimating the calmness modulus of the optimal set (argmin) mapping in linear optimization under uniqueness of nominal optimal solutions. Our analysis is developed in two different parametric settings. First, in the framework of canonical perturbations … Read more
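
For orientation, the calmness modulus referred to above is usually defined for a set-valued mapping as follows (standard background, distinct from the paper's point-based formulas for computing it):

```latex
% Calmness modulus of S : P \rightrightarrows X at (\bar p,\bar x) \in \mathrm{gph}\, S,
% with the convention 0/0 := 0.
\[
  \operatorname{clm} S(\bar p,\bar x)
  \;=\;
  \limsup_{\substack{(p,x)\to(\bar p,\bar x),\; x\in S(p)}}
  \frac{d\big(x,\,S(\bar p)\big)}{d(p,\bar p)}.
\]
```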

From convergence principles to stability and optimality conditions

We show, in a rather general setting, that Hölder and Lipschitz stability properties of solutions to variational problems can be characterized by the convergence of more or less abstract iteration schemes. Depending on the principle of convergence, new and intrinsic stability conditions can be derived. Our most abstract models are (multi-)functions on complete metric spaces. … Read more
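
The stability notions in question can be stated, in one common form, as follows (a standard calmness-type definition given only for orientation; the paper's abstract models are more general):

```latex
% A solution mapping S is calm of order q at (\bar p,\bar x) \in \mathrm{gph}\, S
% (H\"older stability; the case q = 1 is Lipschitz stability) if, for some
% \kappa \ge 0 and all (p,x) \in \mathrm{gph}\, S near (\bar p,\bar x),
\[
  d\big(x,\,S(\bar p)\big) \;\le\; \kappa\, d(p,\bar p)^{\,q}.
\]
```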

Generalized differentiation with positively homogeneous maps: Applications in set-valued analysis and metric regularity

We propose a new concept of generalized differentiation of set-valued maps that captures first-order information. This concept encompasses the standard notions of Fréchet differentiability, strict differentiability, calmness and Lipschitz continuity in single-valued maps, and the Aubin property and Lipschitz continuity in set-valued maps. We present calculus rules, sharpen the relationship between the Aubin … Read more
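
For reference, the Aubin property mentioned above is the standard Lipschitz-like property of set-valued maps (a background definition, not the paper's new derivative concept):

```latex
% A set-valued map S : X \rightrightarrows Y has the Aubin property at
% (\bar x,\bar y) \in \mathrm{gph}\, S if there exist \kappa \ge 0 and
% neighborhoods U of \bar x and V of \bar y such that
\[
  S(x)\cap V \;\subseteq\; S(x') + \kappa\,\|x-x'\|\,\mathbb{B}_Y
  \qquad \text{for all } x,x'\in U,
\]
% where \mathbb{B}_Y is the closed unit ball of Y.
```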