Coordinate Descent Without Coordinates: Tangent Subspace Descent on Riemannian Manifolds

We extend coordinate descent to manifold domains, and provide convergence analyses for geodesically convex and non-convex smooth objective functions. Our key insight is to draw an analogy between coordinate blocks in Euclidean space and tangent subspaces of a manifold. Hence, our method is called tangent subspace descent (TSD). The core principle behind ensuring convergence of …
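To make the coordinate-block analogy concrete, here is a minimal sketch of the idea on the unit sphere: in place of a coordinate block, each iteration projects the Riemannian gradient onto a randomly drawn tangent subspace and retracts back to the manifold. The subspace-selection rule, step size, and all names below are illustrative assumptions, not the selection rules analyzed in the paper.

    import numpy as np

    def tsd_sphere(grad_f, x0, step=0.1, block_dim=2, iters=100, seed=0):
        # Sketch of tangent subspace descent on the unit sphere S^{n-1}.
        # A random tangent subspace plays the role of a coordinate block.
        rng = np.random.default_rng(seed)
        x = x0 / np.linalg.norm(x0)
        for _ in range(iters):
            g = grad_f(x)
            rg = g - (g @ x) * x           # Riemannian gradient: project g onto the tangent space at x
            B = rng.standard_normal((x.size, block_dim))
            B -= np.outer(x, x @ B)        # make the candidate directions tangent at x
            Q, _ = np.linalg.qr(B)         # orthonormal basis of the chosen tangent subspace
            d = Q @ (Q.T @ rg)             # gradient restricted to that subspace
            x = x - step * d
            x /= np.linalg.norm(x)         # retraction: renormalize back onto the sphere
        return x

For instance, with grad_f = lambda x: 2 * A @ x this descends the quadratic $x^\top A x$ over the sphere and, for a suitably small step, drifts toward an eigenvector of the smallest eigenvalue of A.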

The condition number of a function relative to a set

The condition number of a differentiable convex function, namely the ratio of its smoothness to strong convexity constants, is closely tied to fundamental properties of the function. In particular, the condition number of a quadratic convex function is the square of the aspect ratio of a canonical ellipsoid associated to the function. Furthermore, the condition …
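For the quadratic case mentioned above, the claim can be verified directly; the following is a standard computation in our own notation, with $A$ any symmetric positive definite matrix, not a formula taken from the paper:

    f(x) = \tfrac{1}{2}\, x^\top A x, \qquad
    L = \lambda_{\max}(A), \quad \mu = \lambda_{\min}(A), \qquad
    \kappa(f) = \frac{L}{\mu}
              = \left( \frac{1/\sqrt{\lambda_{\min}(A)}}{1/\sqrt{\lambda_{\max}(A)}} \right)^{2}

Here $1/\sqrt{\lambda_{\min}(A)}$ and $1/\sqrt{\lambda_{\max}(A)}$ are the longest and shortest semi-axes of the ellipsoid $\{x : x^\top A x \le 1\}$, so the right-hand side is exactly the squared aspect ratio of that ellipsoid.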

A unified framework for Bregman proximal methods: subgradient, gradient, and accelerated gradient schemes

We provide a unified framework for analyzing the convergence of Bregman proximal first-order algorithms for convex minimization. Our framework hinges on properties of the convex conjugate and gives novel proofs of the convergence rates of the Bregman proximal subgradient, Bregman proximal gradient, and a new accelerated Bregman proximal gradient algorithm under fairly general and mild …
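As one familiar instance of this family, the Bregman proximal gradient step with the negative-entropy kernel on the probability simplex reduces to the exponentiated-gradient (mirror descent) update. The sketch below shows only that special case; the step size, interface, and names are illustrative assumptions rather than the paper's algorithm.

    import numpy as np

    def bregman_prox_grad(grad_f, x0, step=0.1, iters=100):
        # Bregman proximal gradient with h = negative entropy on the simplex:
        #   x+ = argmin_u <grad_f(x), u> + (1/step) * D_h(u, x),
        # which has the closed-form multiplicative update below.
        x = np.asarray(x0, dtype=float)
        x = x / x.sum()
        for _ in range(iters):
            y = x * np.exp(-step * grad_f(x))  # multiplicative (mirror) step
            x = y / y.sum()                    # renormalize onto the simplex
        return x

The multiplicative form is what makes the Bregman (rather than Euclidean) proximal step attractive here: the iterates stay strictly inside the simplex with no explicit projection.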

The condition of a function relative to a polytope

The condition number of a smooth convex function, namely the ratio of its smoothness to strong convexity constants, is closely tied to fundamental properties of the function. In particular, the condition number of a quadratic convex function is precisely the square of the diameter-to-width ratio of a canonical ellipsoid associated to the function. Furthermore, the …
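The quadratic claim is easy to check numerically: the snippet below compares $\lambda_{\max}/\lambda_{\min}$ with the squared diameter-to-width ratio of the ellipsoid $\{x : x^\top A x \le 1\}$ for an arbitrary positive definite matrix (the matrix and dimensions are illustrative, not from the paper).

    import numpy as np

    # Numerical check: for f(x) = (1/2) x^T A x with A positive definite,
    # the condition number equals the squared diameter-to-width ratio of
    # the ellipsoid {x : x^T A x <= 1}.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((4, 4))
    A = M @ M.T + np.eye(4)                # an arbitrary positive definite matrix
    eigs = np.linalg.eigvalsh(A)           # eigenvalues in ascending order
    kappa = eigs[-1] / eigs[0]
    diameter = 2 / np.sqrt(eigs[0])        # twice the longest semi-axis
    width = 2 / np.sqrt(eigs[-1])          # twice the shortest semi-axis
    print(kappa, (diameter / width) ** 2)  # the two values coincide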

Convergence rates of proximal gradient methods via the convex conjugate

We give a novel proof of the $O(1/k)$ and $O(1/k^2)$ convergence rates of the proximal gradient and accelerated proximal gradient methods for composite convex minimization. The crux of the new proof is an upper bound constructed via the convex conjugate of the objective function.

Citation: Technical Report, Carnegie Mellon University, January 2018.
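For reference, the two methods whose rates are reproved are standard; a minimal sketch of both on the lasso objective $(1/2)\|Ax - b\|^2 + \lambda\|x\|_1$ follows, with the usual textbook step-size rule and momentum schedule rather than any details from the report.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1 (shrinkage).
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def proximal_gradient(A, b, lam, iters=500, accelerated=False):
        # Proximal gradient (O(1/k) rate) and its accelerated variant
        # (O(1/k^2) rate) on (1/2)||Ax - b||^2 + lam * ||x||_1.
        L = np.linalg.norm(A, 2) ** 2      # smoothness constant ||A^T A||
        x = y = np.zeros(A.shape[1])
        t = 1.0
        for _ in range(iters):
            x_new = soft_threshold(y - A.T @ (A @ y - b) / L, lam / L)
            if accelerated:
                t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
                y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum step
                t = t_new
            else:
                y = x_new
            x = x_new
        return x

Running both variants and plotting objective gaps against the iteration count makes the $1/k$ versus $1/k^2$ decay visible directly.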