Efficiency of minimizing compositions of convex functions and smooth maps

We consider the problem of minimizing a sum of a convex function and a composition of a convex function with a smooth map. Important examples include exact penalty formulations of nonlinear programs and nonlinear least squares problems with side constraints. The basic algorithm we rely on is the well-known prox-linear method, which in each iteration solves a regularized subproblem formed by linearizing the smooth map. When the subproblems are solved exactly, the method has the efficiency guarantee $O(\epsilon^{-2})$, akin to gradient descent for smooth minimization. Our contributions are threefold: we (1) derive an inertial prox-linear method that accelerates in the presence of convexity, (2) quantify the impact of inexact subproblem solves, and (3) discuss the efficiency of smoothing techniques. When the subproblems can only be solved by first-order methods, we show that a simple combination of the prox-linear method, smoothing, and fast-gradient subproblem solves yields a scheme with overall efficiency $O(\epsilon^{-3})$, up to log factors. This appears to be the best known complexity bound for the problem class among first-order methods.
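To make the iteration concrete, here is a minimal sketch of the prox-linear method on the special case of nonlinear least squares, i.e. $h = \tfrac{1}{2}\|\cdot\|^2$ and no side function $g$. In this case the regularized subproblem $\min_y \tfrac{1}{2}\|c(x) + \nabla c(x)(y-x)\|^2 + \tfrac{\beta}{2}\|y-x\|^2$ has a closed-form (Levenberg–Marquardt-style) solution. The function name, the fixed damping parameter, and the Rosenbrock-type test map are illustrative choices, not the paper's:

```python
import numpy as np

def prox_linear_nlls(c, jac, x0, beta=1.0, iters=200):
    """Minimize 0.5*||c(x)||^2 by the prox-linear iteration
        x+ = argmin_y 0.5*||c(x) + J(x)(y - x)||^2 + (beta/2)*||y - x||^2,
    whose minimizer is the damped Gauss-Newton step below."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r, J = c(x), jac(x)
        # Closed-form solution of the convex subproblem.
        step = np.linalg.solve(J.T @ J + beta * np.eye(x.size), J.T @ r)
        x = x - step
    return x

# Toy smooth map with zero residual at (1, 1): Rosenbrock in residual form.
c = lambda x: np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
jac = lambda x: np.array([[-20.0 * x[0], 10.0], [-1.0, 0.0]])

x_star = prox_linear_nlls(c, jac, x0=[-1.2, 1.0])
```

With a general convex $h$ (e.g. an $\ell_1$ exact penalty) the subproblem no longer has a closed form and must itself be solved by a convex solver; quantifying the cost of such inexact solves is precisely contribution (2) of the paper.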

12/2016