On Optimal Universal First-Order Methods for Minimizing Heterogeneous Sums

This work considers minimizing a convex sum of functions, each with potentially different structure, ranging from nonsmooth to smooth and from Lipschitz to non-Lipschitz. Nesterov's universal fast gradient method provides an optimal black-box first-order method for minimizing a single function, exploiting whatever Hölder continuity structure is present without requiring prior knowledge of it. In this paper, we show that this landmark method, without modification, further adapts to heterogeneous sums. For example, it minimizes the sum of a nonsmooth $M$-Lipschitz function and an $L$-smooth function at a rate of $O(M^2/\epsilon^2 + \sqrt{L/\epsilon})$ without knowledge of $M$, $L$, or even that the objective is a sum of two terms. This rate is precisely the sum of the optimal convergence rates for each term's individual complexity class. More generally, we show that sums of varied Hölder smooth functions introduce no new complexity and require at most as many iterations as would be needed to minimize each summand separately. Extensions to strongly convex and Hölder growth settings, as well as simple matching lower bounds, are also provided.
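To make the adaptivity claim concrete, below is a minimal Euclidean-prox sketch of Nesterov's universal fast gradient method, the method whose sum-adaptivity the paper establishes. The doubling line search on the local estimate `M`, the fixed iteration budget, and all variable names are illustrative assumptions rather than the paper's exact presentation; the point is that the method needs only an objective value, a (sub)gradient oracle, and a target accuracy `eps`, with no knowledge of $M$, $L$, or the sum structure.

```python
import numpy as np

def universal_fgm(f, grad, x0, eps, L0=1.0, max_iter=500):
    """Sketch of Nesterov's universal fast gradient method (Euclidean prox).

    f    : callable returning the objective value
    grad : callable returning a (sub)gradient of f
    x0   : starting point
    eps  : target accuracy, which also sets the inexact line-search slack
    L0   : initial guess for the effective smoothness constant (any value > 0)
    """
    y = v = np.asarray(x0, dtype=float)
    A, L = 0.0, L0
    for _ in range(max_iter):
        M = L
        while True:
            # Step size a solves M * a^2 = A + a.
            a = (1.0 + np.sqrt(1.0 + 4.0 * M * A)) / (2.0 * M)
            A_new = A + a
            tau = a / A_new
            x_hat = tau * v + (1.0 - tau) * y
            g = grad(x_hat)
            v_new = v - a * g                       # estimate-sequence (mirror) step
            y_new = tau * v_new + (1.0 - tau) * y
            # Inexact descent test with slack eps*tau/2; failure doubles M.
            d = y_new - x_hat
            if f(y_new) <= f(x_hat) + g @ d + 0.5 * M * (d @ d) + 0.5 * eps * tau:
                break
            M *= 2.0
        y, v, A, L = y_new, v_new, A_new, M / 2.0  # let the estimate shrink next round
    return y
```

For instance, on the heterogeneous sum $f(x) = \|x\|_1 + \tfrac{1}{2}\|x\|^2$ (nonsmooth plus smooth), one can call `universal_fgm(lambda x: np.abs(x).sum() + 0.5 * x @ x, lambda x: np.sign(x) + x, np.ones(5), eps=1e-3)`; the same call works unchanged regardless of which Hölder smoothness classes the summands occupy.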
