Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria

We consider optimization algorithms that successively minimize simple Taylor-like models of the objective function. Methods of Gauss-Newton type for minimizing the composition of a convex function and a smooth map are common examples. Our main result is an explicit relationship between the step-size of any such algorithm and the slope of the function at a … Read more
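
A hedged illustration, in notation we introduce here rather than the paper's: for minimizing a composition $h(c(x))$ with $h$ convex and $c$ smooth, a Gauss-Newton type step minimizes the Taylor-like model obtained by linearizing only the smooth inner map,
\[
x_{k+1} \;\in\; \operatorname*{argmin}_{x}\; h\bigl(c(x_k) + \nabla c(x_k)(x - x_k)\bigr) + \tfrac{1}{2t}\|x - x_k\|^2 ,
\]
with $t > 0$ a step-size parameter.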

Error bounds, quadratic growth, and linear convergence of proximal methods

We show that the error bound property, postulating that the step lengths of the proximal gradient method linearly bound the distance to the solution set, is equivalent to a standard quadratic growth condition. We exploit this equivalence in an analysis of asymptotic linear convergence of the proximal gradient algorithm for structured problems, which lack … Read more
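
Schematically, in notation of our own: for $F = g + h$ with $g$ smooth and $h$ convex, write $x^{+} = \operatorname{prox}_{th}\bigl(x - t\nabla g(x)\bigr)$ for the proximal gradient update and $S$ for the solution set. The error bound property asks for a constant $\kappa$ with
\[
\operatorname{dist}(x, S) \;\le\; \kappa\,\|x^{+} - x\| \qquad \text{for all } x \text{ near } S,
\]
while quadratic growth asks that $F(x) \ge \min F + \alpha \operatorname{dist}^2(x, S)$ for such $x$ and some $\alpha > 0$.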

Variational Analysis of the Crouzeix Ratio

Let $W(A)$ denote the field of values (numerical range) of a matrix $A$. For any polynomial $p$ and matrix $A$, define the Crouzeix ratio to have numerator $\max\left\{|p(\zeta)|:\zeta\in W(A)\right\}$ and denominator $\|p(A)\|_2$. M. Crouzeix’s 2004 conjecture postulates that the globally minimal value of the Crouzeix ratio is $1/2$, over all polynomials $p$ of any degree and … Read more
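
In symbols (the name $\psi$ is ours), the Crouzeix ratio is
\[
\psi(p, A) \;=\; \frac{\max\{\,|p(\zeta)| : \zeta \in W(A)\,\}}{\|p(A)\|_2},
\]
and the conjecture asserts $\psi(p, A) \ge 1/2$ for every polynomial $p$ and square matrix $A$. The value $1/2$ is attained, for instance, by $p(\zeta) = \zeta$ and $A = \begin{pmatrix} 0 & 2 \\ 0 & 0 \end{pmatrix}$, whose field of values is the closed unit disk.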

Alternating projections and coupling slope

We consider the method of alternating projections for finding a point in the intersection of two possibly nonconvex closed sets. We present a local linear convergence result that makes no regularity assumptions on either set (unlike previous results), while at the same time weakening standard transversal intersection assumptions. The proof grows out of a study … Read more
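
In notation we use here only for orientation: with nearest-point mappings $P_Q$ and $P_R$ onto the two closed sets $Q$ and $R$, the method iterates
\[
x_{k+1} \;\in\; P_{Q}\bigl(P_{R}(x_k)\bigr),
\]
and a local linear convergence result of the kind described asserts, roughly, that $\operatorname{dist}(x_k, Q \cap R) \le c\,r^{k}$ for some $r \in (0,1)$, for all starting points sufficiently close to a suitable point of the intersection.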

Orthogonal invariance and identifiability

Orthogonally invariant functions of symmetric matrices often inherit properties from their diagonal restrictions: von Neumann’s theorem on matrix norms is an early example. We discuss the example of “identifiability”, a common property of nonsmooth functions associated with the existence of a smooth manifold of approximate critical points. Identifiability (or its synonym, “partial smoothness”) is the … Read more
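
The standard correspondence behind this inheritance, in notation we introduce here:
\[
F(U^\top X U) = F(X) \;\;\text{for all orthogonal } U
\qquad\Longleftrightarrow\qquad
F = f \circ \lambda \;\;\text{with } f \text{ permutation-invariant},
\]
where $\lambda(X)$ denotes the vector of eigenvalues of the symmetric matrix $X$; this $f$ is the “diagonal restriction” from which properties of $F$ are read off.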

Trajectories of Descent

Steepest descent drives both theory and practice of nonsmooth optimization. We study slight relaxations of two influential notions of steepest descent curves — curves of maximal slope and solutions to evolution equations. In particular, we provide a simple proof showing that lower-semicontinuous functions that are locally Lipschitz continuous on their domains — functions playing a … Read more
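
One of the two notions mentioned, written in notation of our own, is the trajectory of the subgradient evolution equation
\[
\dot{\gamma}(t) \;\in\; -\partial f\bigl(\gamma(t)\bigr) \quad \text{for a.e. } t \ge 0, \qquad \gamma(0) = x_0,
\]
which reduces to the classical steepest-descent flow $\dot{\gamma} = -\nabla f(\gamma)$ when $f$ is smooth.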

Clarke subgradients for directionally Lipschitzian stratifiable functions

Using a geometric argument, we show that under a reasonable continuity condition, the Clarke subdifferential of a semi-algebraic (or more generally stratifiable) directionally Lipschitzian function admits a simple form: the normal cone to the domain and limits of gradients generate the entire Clarke subdifferential. The characterization formula we obtain unifies various apparently disparate results that … Read more
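
A schematic rendering of the characterization formula (the precise hypotheses and statement are those of the paper): at a point $\bar x$,
\[
\partial_c f(\bar x) \;=\; \overline{\operatorname{conv}}\,\Bigl(\bigl\{\lim_{i} \nabla f(x_i) : x_i \to \bar x,\; f \text{ differentiable at } x_i\bigr\} \;+\; N_{\operatorname{dom} f}(\bar x)\Bigr),
\]
so that limits of gradients together with the normal cone to the domain generate the entire Clarke subdifferential.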

Optimality, identifiability, and sensitivity

Around a solution of an optimization problem, an “identifiable” subset of the feasible region is one containing all nearby solutions after small perturbations to the problem. A quest for only the most essential ingredients of sensitivity analysis leads us to consider identifiable sets that are “minimal”. This new notion lays a broad and intuitive variational-analytic … Read more

Tilt stability, uniform quadratic growth, and strong metric regularity of the subdifferential.

We prove that uniform second order growth, tilt stability, and strong metric regularity of the subdifferential — three notions that have appeared in entirely different settings — are all essentially equivalent for any lower-semicontinuous, extended-real-valued function.

Citation: Cornell University, School of Operations Research and Information Engineering, 206 Rhodes Hall, Cornell University, Ithaca, NY 14853, May … Read more
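
A hedged sketch of the growth notion above, in our own notation: quadratic growth at a local minimizer $\bar x$ requires constants $\alpha, \epsilon > 0$ with
\[
f(x) \;\ge\; f(\bar x) + \frac{\alpha}{2}\|x - \bar x\|^2 \qquad \text{whenever } \|x - \bar x\| \le \epsilon,
\]
and the uniform version requires such an inequality to hold, with the same constants, at the minimizers of all small tilt perturbations $x \mapsto f(x) - \langle v, x \rangle$.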

Partial Smoothness, Tilt Stability, and Generalized Hessians

We compare two recent variational-analytic approaches to second-order conditions and sensitivity analysis for nonsmooth optimization. We describe a broad setting where computing the generalized Hessian of Mordukhovich is easy. In this setting, the idea of tilt stability introduced by Poliquin and Rockafellar is equivalent to a classical smooth second-order condition.
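
For orientation, in the unconstrained $C^2$ case the classical second-order sufficient condition at a point $\bar x$ reads
\[
\nabla f(\bar x) = 0 \qquad \text{and} \qquad \nabla^2 f(\bar x) \succ 0,
\]
that is, a critical point with positive definite Hessian.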