Empirical Risk Minimization: Probabilistic Complexity and Stepsize Strategy

Empirical risk minimization (ERM) is a special class of problems within standard convex optimization. When a first-order method is used, the Lipschitz constant of the empirical risk plays a crucial role in the convergence analysis and stepsize strategy for these problems. We derive probabilistic bounds for such Lipschitz constants using random matrix theory. We …
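
The abstract is truncated, but the idea can be illustrated. Below is a minimal sketch (not the paper's actual derivation): for least-squares ERM, $f(w) = \frac{1}{2n}\|Xw - y\|^2$, the gradient Lipschitz constant is $L = \sigma_{\max}(X)^2 / n$, and for a data matrix with i.i.d. standard Gaussian entries, random matrix theory gives $\sigma_{\max}(X) \le \sqrt{n} + \sqrt{d} + t$ with probability at least $1 - e^{-t^2/2}$. The toy problem and all names here are my own assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, t = 5000, 200, 3.0

X = rng.standard_normal((n, d))
L_exact = np.linalg.norm(X, 2) ** 2 / n           # sigma_max(X)^2 / n
L_bound = (np.sqrt(n) + np.sqrt(d) + t) ** 2 / n  # RMT high-probability bound

print(f"exact Lipschitz constant:  {L_exact:.3f}")
print(f"probabilistic upper bound: {L_bound:.3f}")
# A first-order method can then use the safe stepsize 1/L_bound
# without ever computing sigma_max(X) exactly.
```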

Homotopy Smoothing for Non-Smooth Problems with Lower Complexity than O(1/epsilon)

In this paper, we develop a novel {\bf ho}moto{\bf p}y {\bf s}moothing (HOPS) algorithm for solving a family of non-smooth problems, each composed of a non-smooth term with an explicit max-structure and either a smooth term or a simple non-smooth term whose proximal mapping is easy to compute. The best known iteration complexity for solving …
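
A minimal sketch of the homotopy-smoothing idea (illustrative only, not the HOPS algorithm itself, which is cut off here): smooth the non-smooth term $|x - 1|$ with a Nesterov/Huber smoothing parameter $\mu$, run a first-order method on the smoothed objective, then warm-start the next stage with $\mu$ halved. The toy objective $f(x) = |x-1| + 0.05\,x^2$ (minimizer $x^* = 1$) and all function names are my own assumptions.

```python
import numpy as np

def huber_grad(z, mu):
    # Gradient of the mu-smoothed absolute value (Nesterov smoothing of |z|).
    return np.clip(z / mu, -1.0, 1.0)

def homotopy_smoothing(x0, mu0=1.0, stages=6, iters=300):
    x, mu = x0, mu0
    for _ in range(stages):
        step = 1.0 / (1.0 / mu + 0.1)   # stepsize 1/L for the smoothed f
        for _ in range(iters):
            g = huber_grad(x - 1.0, mu) + 0.1 * x   # grad of smoothed f
            x -= step * g
        mu /= 2.0   # homotopy: halve the smoothing parameter, warm-start x
    return x

print(homotopy_smoothing(5.0))   # ~0.997, close to the minimizer x* = 1
```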

RSG: Beating Subgradient Method without Smoothness and Strong Convexity

In this paper, we study the efficiency of a {\bf R}estarted {\bf S}ub{\bf G}radient (RSG) method that periodically restarts the standard subgradient method (SG). We show that, when applied to a broad class of convex optimization problems, the RSG method can find an $\epsilon$-optimal solution with lower complexity than the SG method. In particular, we first …
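
Below is a minimal sketch of the restarting idea (illustrative only; the paper's precise scheme and restart schedule are truncated above): run plain SG for a fixed number of iterations, restart from the averaged iterate, and halve the stepsize after each restart. On the toy sharp objective $f(x) = |x - 2|$, assumed here for illustration, each restart roughly halves the error, giving geometric progress in the number of restarts.

```python
import numpy as np

def subgradient(x):
    # Subgradient of the sharp objective f(x) = |x - 2|, minimized at x* = 2.
    return np.sign(x - 2.0)

def rsg(x0, eta0=1.0, restarts=10, iters=50):
    x, eta = x0, eta0
    for _ in range(restarts):
        avg = 0.0
        for _ in range(iters):
            avg += x / iters          # running average of the SG iterates
            x -= eta * subgradient(x)
        x = avg                       # restart SG from the averaged point
        eta /= 2.0                    # geometrically decrease the stepsize
    return x

print(rsg(10.0))   # approaches x* = 2 far faster than plain SG
```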

Multilevel Optimization Methods: Convergence and Problem Structure

Building upon multigrid methods, the framework of multilevel optimization methods was developed to solve structured optimization problems, including problems in optimal control and image processing. In this paper, we give a broader view of the multilevel framework and establish connections between multilevel algorithms and other approaches. An interesting case of the so-called …
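
A minimal two-level sketch of the multilevel idea (illustrative only; the paper's framework is cut off above): take a few gradient "smoothing" steps on the fine level, build a Galerkin coarse model via a restriction operator R and prolongation P, solve the small coarse problem for a correction, and prolongate it back. The 1-D Laplacian quadratic and the operator choices below are my own assumptions.

```python
import numpy as np

def prolongation(n_coarse):
    # Linear-interpolation prolongation from n_coarse to 2*n_coarse + 1 points.
    P = np.zeros((2 * n_coarse + 1, n_coarse))
    for j in range(n_coarse):
        P[2 * j, j] = 0.5
        P[2 * j + 1, j] = 1.0
        P[2 * j + 2, j] = 0.5
    return P

def two_level(A, b, x, cycles=5, smooth_steps=3):
    P = prolongation((len(b) - 1) // 2)
    R = 0.5 * P.T                        # restriction as scaled transpose
    Ac = R @ A @ P                       # Galerkin coarse-level Hessian
    step = 1.0 / np.linalg.norm(A, 2)    # safe gradient stepsize 1/L
    for _ in range(cycles):
        for _ in range(smooth_steps):    # fine-level smoothing (gradient steps)
            x -= step * (A @ x - b)
        g = A @ x - b                    # coarse correction for the residual
        x -= P @ np.linalg.solve(Ac, R @ g)
    return x

# Toy problem: minimize 0.5 x^T A x - b^T x with A the 1-D Laplacian.
n = 31
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = two_level(A, b, np.zeros(n))
print(np.linalg.norm(A @ x - b))   # residual shrinks rapidly per cycle
```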