Inexact FISTA-like Methods with Adaptive Backtracking

Accelerated proximal gradient methods have become a useful tool in large-scale convex optimization, especially for variational regularization with non-smooth priors. Prevailing convergence analyses assume that the proximal and gradient steps can be performed exactly. In some practical applications, however, the proximal or gradient steps must be computed inexactly, which can harm convergence speed …
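
As context for this abstract, the following is a minimal sketch of standard FISTA with backtracking line search for the composite problem min_x f(x) + g(x), with exact proximal and gradient steps; the inexact computations and the specific adaptive backtracking rule studied in the paper are not reproduced here. The choices f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1, and all parameter names are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_backtracking(A, b, lam, x0, L0=1.0, eta=2.0, max_iter=200):
    # Illustrative sketch, not the paper's method: exact prox/gradient steps,
    # classical Beck-Teboulle backtracking that only increases L.
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad_f = lambda x: A.T @ (A @ x - b)
    x, y, t, L = x0.copy(), x0.copy(), 1.0, L0
    for _ in range(max_iter):
        g = grad_f(y)
        # Backtracking: increase L until the quadratic upper bound at y holds.
        while True:
            x_new = soft_threshold(y - g / L, lam / L)
            diff = x_new - y
            if f(x_new) <= f(y) + g @ diff + 0.5 * L * (diff @ diff):
                break
            L *= eta
        # Nesterov momentum step.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t ** 2))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```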

Convergence rates of proximal gradient methods via the convex conjugate

We give a novel proof of the $O(1/k)$ and $O(1/k^2)$ convergence rates of the proximal gradient and accelerated proximal gradient methods for composite convex minimization. The crux of the new proof is an upper bound constructed via the convex conjugate of the objective function. Citation: Technical Report, Carnegie Mellon University, January 2018.
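
For reference, these are the standard forms of the two rates in question (as usually stated for $F = f + g$ with $f$ convex and $L$-smooth and $g$ convex, fixed step size $1/L$); the conjugate-based proof itself is not reproduced here.

```latex
% Proximal gradient method:
\[
  F(x_k) - F(x^\star) \;\le\; \frac{L \,\|x_0 - x^\star\|^2}{2k},
\]
% Accelerated proximal gradient (FISTA-type) method:
\[
  F(x_k) - F(x^\star) \;\le\; \frac{2 L \,\|x_0 - x^\star\|^2}{(k+1)^2}.
\]
```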

An accelerated proximal gradient algorithm for nuclear norm regularized least squares problems

The affine rank minimization problem, which consists of finding a matrix of minimum rank subject to linear equality constraints, arises in many areas of engineering and science. A specific rank minimization problem is the matrix completion problem, in which we wish to recover a (low-rank) data matrix from incomplete samples of its entries. …
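
As an illustration of the general approach named in the title, the sketch below applies an accelerated proximal gradient step to nuclear norm regularized least squares, specialized to matrix completion, where the proximal operator is singular value thresholding. The objective assumed here is 0.5*||P_Omega(X - M)||_F^2 + mu*||X||_*; the variable names (M_obs, mask, mu) and the fixed step size are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def svt(Y, tau):
    """Singular value thresholding: proximal operator of tau*||.||_*."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

def apg_matrix_completion(M_obs, mask, mu, max_iter=200):
    # M_obs: matrix with observed entries, mask: 0/1 indicator of observations.
    X = np.zeros_like(M_obs)
    Y, t = X.copy(), 1.0
    L = 1.0  # Lipschitz constant of the gradient of the sampling loss.
    for _ in range(max_iter):
        grad = mask * (Y - M_obs)          # gradient of the smooth data-fit term
        X_new = svt(Y - grad / L, mu / L)  # proximal (SVT) step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t ** 2))
        Y = X_new + ((t - 1.0) / t_new) * (X_new - X)
        X, t = X_new, t_new
    return X
```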