An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and its Implications to Second-Order Methods

This paper presents an accelerated variant of the hybrid proximal extragradient (HPE) method for convex optimization, referred to as the accelerated HPE (A-HPE) method. Iteration-complexity results are established for the A-HPE method, as well as for a special version of it in which a large-step condition is imposed. Two specific implementations of the A-HPE method are described in the context of a structured convex optimization problem whose objective function is the sum of a smooth convex function and an extended real-valued non-smooth convex function. The first implementation yields a generalization of a variant of Nesterov’s method for the case where the smooth component of the objective function has a Lipschitz continuous gradient. The second implementation yields an accelerated Newton proximal extragradient (A-NPE) method for the case where the smooth component of the objective function has a Lipschitz continuous Hessian. It is shown that the A-NPE method has an ${\cal O}(1/k^{7/2})$ convergence rate, which improves upon the ${\cal O}(1/k^3)$ convergence-rate bound of another accelerated Newton-type method presented by Nesterov. Finally, while Nesterov’s method relies on exact solutions of subproblems with cubic regularization terms, the A-NPE method relies on inexact solutions of subproblems with quadratic regularization terms, and hence is potentially more tractable from a computational point of view.
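To make the structured model concrete: both implementations target composite problems of the form $\min_x f(x) + h(x)$, with $f$ smooth convex and $h$ convex and possibly non-smooth. The sketch below shows a standard accelerated proximal gradient (FISTA-style) iteration for this model, i.e., the classical scheme that the first implementation generalizes. It is illustrative only, not the paper's A-HPE or A-NPE algorithm; the function names and the least-squares-plus-$\ell_1$ test problem are hypothetical choices for the example.

```python
import numpy as np

def accel_prox_grad(grad_f, prox_h, L, x0, iters=200):
    """FISTA-style accelerated proximal gradient for min f(x) + h(x),
    where f is convex with L-Lipschitz gradient and h is convex,
    possibly non-smooth, accessed only through its proximal operator."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        # Forward-backward step: gradient step on f, proximal step on h.
        x_next = prox_h(y - grad_f(y) / L, 1.0 / L)
        # Nesterov momentum parameter and extrapolation.
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t**2)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Illustrative instance: f(x) = 0.5*||Ax - b||^2 (so L = ||A||_2^2)
# and h(x) = lam*||x||_1, whose proximal map is soft-thresholding.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2
grad_f = lambda x: A.T @ (A @ x - b)
prox_h = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)
x_sol = accel_prox_grad(grad_f, prox_h, L, np.zeros(100))
```

Note that $h$ enters only through its proximal map; the proximal subproblems in the abstract play the analogous role, with the A-NPE variant solving its quadratically regularized subproblems only inexactly.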
