Local Convergence of the Heavy-ball Method and iPiano for Non-convex Optimization

A local convergence result for abstract descent methods is proved. The sequence of iterates is attracted by a local (or global) minimum, stays in its neighborhood, and converges within this neighborhood. This result allows algorithms to exploit local properties of the objective function. In particular, the abstract theory in this paper applies to the inertial …
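
For orientation, the heavy-ball update named in the title is the classical two-step iteration x_{k+1} = x_k − α∇f(x_k) + β(x_k − x_{k−1}). Below is a minimal Python sketch of that iteration; the test objective, step size α, and inertia β are illustrative choices, not the step-size conditions analyzed in the paper.

```python
import numpy as np

def heavy_ball(grad_f, x0, alpha=1e-3, beta=0.8, iters=20000):
    # Classical heavy-ball update:
    #   x_{k+1} = x_k - alpha * grad_f(x_k) + beta * (x_k - x_{k-1})
    x_prev = np.asarray(x0, dtype=float).copy()
    x = x_prev.copy()
    for _ in range(iters):
        # Right-hand side uses the old (x, x_prev) pair, then shifts the history.
        x, x_prev = x - alpha * grad_f(x) + beta * (x - x_prev), x
    return x

# Illustrative nonconvex test problem: the Rosenbrock function
# f(x, y) = (1 - x)^2 + 100 (y - x^2)^2, minimized at (1, 1).
def grad_rosenbrock(z):
    x, y = z
    return np.array([-2 * (1 - x) - 400 * x * (y - x**2),
                     200 * (y - x**2)])

print(heavy_ball(grad_rosenbrock, np.array([-1.0, 1.0])))  # should approach (1, 1)
```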

Optimality, identifiability, and sensitivity

Around a solution of an optimization problem, an “identifiable” subset of the feasible region is one containing all nearby solutions after small perturbations to the problem. A quest for only the most essential ingredients of sensitivity analysis leads us to consider identifiable sets that are “minimal”. This new notion lays a broad and intuitive variational-analytic …
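
As a rough formalization (one common in the variational-analysis literature; the paper's precise definition may differ): for a function f with subdifferential ∂f, a set M is identifiable at a critical point x̄ for v̄ ∈ ∂f(x̄) when every nearby approximately critical sequence eventually enters M.

```latex
% Hedged paraphrase of identifiability, not necessarily the paper's exact wording:
% M is identifiable at \bar{x} for \bar{v} \in \partial f(\bar{x}) if
\[
  x_k \to \bar{x}, \quad f(x_k) \to f(\bar{x}), \quad
  v_k \in \partial f(x_k), \; v_k \to \bar{v}
  \quad \Longrightarrow \quad x_k \in M \ \text{for all large } k.
\]
```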

Tilt stability, uniform quadratic growth, and strong metric regularity of the subdifferential

We prove that uniform second order growth, tilt stability, and strong metric regularity of the subdifferential — three notions that have appeared in entirely different settings — are all essentially equivalent for any lower-semicontinuous, extended-real-valued function.

Citation: Cornell University, School of Operations Research and Information Engineering, 206 Rhodes Hall, Cornell University, Ithaca, NY 14853. May 2012.
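
For reference, tilt stability (a notion due to Poliquin and Rockafellar) is usually stated as follows; the growth and regularity conditions in the abstract are its companions. This is the standard definition from the literature, not a quotation from the paper.

```latex
% Tilt stability: a local minimizer \bar{x} of f is tilt-stable if, for some
% \epsilon > 0, the tilted argmin mapping
\[
  M(v) \;=\; \operatorname*{argmin}_{\|x - \bar{x}\| \le \epsilon}
             \bigl\{ f(x) - \langle v, x \rangle \bigr\}
\]
% is single-valued and Lipschitz on a neighborhood of v = 0, with M(0) = \bar{x}.
```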

Lipschitz behavior of the robust regularization

To minimize or upper-bound the value of a function “robustly”, we might instead minimize or upper-bound the “epsilon-robust regularization”, defined as the map from a point to the maximum value of the function within an epsilon-radius. This regularization may be easy to compute: convex quadratics lead to semidefinite-representable regularizations, for example, and the spectral radius …
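
In symbols, the construction described above is, for a radius ε > 0,

```latex
\[
  \bar{f}_{\epsilon}(x) \;=\; \max_{\|y - x\| \le \epsilon} f(y),
\]
```

so minimizing or bounding f̄_ε guards the corresponding statement about f against ε-perturbations of the point.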

Local convergence for alternating and averaged nonconvex projections

The idea of a finite collection of closed sets having “strongly regular intersection” at a given point is crucial in variational analysis. We show that this central theoretical tool also has striking algorithmic consequences. Specifically, we consider the case of two sets, one of which we assume to be suitably “regular” (special cases being convex …
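
To make the algorithmic setting concrete, here is a minimal Python sketch of the two projection schemes the title refers to, alternating and averaged projections. The example sets (a line and the nonconvex unit circle) and their projection maps are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def alternating_projections(proj_A, proj_B, x0, iters=100):
    # Method of alternating projections: x <- P_A(P_B(x)).
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = proj_A(proj_B(x))
    return x

def averaged_projections(proj_A, proj_B, x0, iters=100):
    # Averaged projections: x <- (P_A(x) + P_B(x)) / 2.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = 0.5 * (proj_A(x) + proj_B(x))
    return x

# Hypothetical example: the line {y = 0.5} and the (nonconvex) unit circle.
proj_line = lambda x: np.array([x[0], 0.5])      # nearest point on the line
proj_circle = lambda x: x / np.linalg.norm(x)    # nearest point on the circle (x != 0)

# Started near the intersection, the iterates converge linearly to a point
# in both sets, here approximately (sqrt(0.75), 0.5).
print(alternating_projections(proj_line, proj_circle, np.array([2.0, 2.0])))
```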

Robust regularization

Given a real function on a Euclidean space, we consider its “robust regularization”: the value of this new function at any given point is the maximum value of the original function in a fixed neighbourhood of the point in question. This construction allows us to impose constraints in an optimization problem *robustly*, safeguarding a constraint …
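
As a rough illustration of the construction (not a method from these papers, which work with exact representations such as the semidefinite ones mentioned above), a sampling-based estimate of the robust regularization might look like this in Python:

```python
import numpy as np

def robust_reg(f, x, eps, n_samples=1000, rng=None):
    # Monte-Carlo estimate of the robust regularization
    #   f_eps(x) = max { f(y) : ||y - x|| <= eps }
    # (a crude sampling sketch for illustration only).
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x, dtype=float)
    best = f(x)
    for _ in range(n_samples):
        u = rng.normal(size=x.shape)
        # Rescale to a point uniform in the eps-ball: uniform direction,
        # radius eps * U^(1/d) for dimension d.
        u *= eps * rng.random() ** (1 / x.size) / np.linalg.norm(u)
        best = max(best, f(x + u))
    return best

# Safeguarding a constraint g(x) <= 0 against eps-perturbations of x:
# require robust_reg(g, x, eps) <= 0 instead.
```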