Convex Optimization with ALADIN

This paper presents novel convergence results for the Augmented Lagrangian based Alternating Direction Inexact Newton method (ALADIN) in the context of distributed convex optimization. It is shown that, for a large class of convex optimization problems, ALADIN converges from any starting point to a minimizer without needing a line search or other globalization routines. Under additional regularity assumptions, … Read more
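
A minimal sketch of one ALADIN-style iteration on a toy equality-coupled problem (two quadratic agents with coupling A_1 x_1 + A_2 x_2 = b). The problem data, the penalty rho, and the use of exact local Hessians are illustrative assumptions, not the setting analyzed in the paper; it only shows the structure of the decoupled steps followed by a coupled QP, with a full step and no line search.

    import numpy as np

    np.random.seed(0)
    n, m = 3, 2                      # local variable size, number of coupling constraints
    Q = [np.eye(n) * (i + 1) for i in range(2)]   # local costs f_i(x) = 0.5*(x-c_i)'Q_i(x-c_i)
    c = [np.random.randn(n) for _ in range(2)]
    A = [np.random.randn(m, n) for _ in range(2)] # coupling: A_1 x_1 + A_2 x_2 = b
    b = np.random.randn(m)

    rho = 1.0
    x = [np.zeros(n), np.zeros(n)]   # primal iterates
    lam = np.zeros(m)                # multiplier for the coupling constraint

    for k in range(20):
        # 1) decoupled augmented-Lagrangian steps (closed form for quadratic f_i)
        y = [np.linalg.solve(Q[i] + rho * np.eye(n),
                             Q[i] @ c[i] - A[i].T @ lam + rho * x[i]) for i in range(2)]
        # 2) local gradients and (here: exact) Hessians
        g = [Q[i] @ (y[i] - c[i]) for i in range(2)]
        H = Q
        # 3) coupled QP: eliminate d_i = -H_i^{-1}(g_i + A_i' lam_new), then solve for lam_new
        S = sum(A[i] @ np.linalg.solve(H[i], A[i].T) for i in range(2))
        r = b - sum(A[i] @ y[i] for i in range(2))
        lam_new = np.linalg.solve(S, -r - sum(A[i] @ np.linalg.solve(H[i], g[i]) for i in range(2)))
        d = [-np.linalg.solve(H[i], g[i] + A[i].T @ lam_new) for i in range(2)]
        # 4) full step, no globalization
        x = [y[i] + d[i] for i in range(2)]
        lam = lam_new

    print("constraint residual:", np.linalg.norm(A[0] @ x[0] + A[1] @ x[1] - b))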

Fixing and extending some recent results on the ADMM algorithm

We first point out several flaws in the recent paper [R. Shefi, M. Teboulle: Rate of convergence analysis of decomposition methods based on the proximal method of multipliers for convex minimization, SIAM J. Optim. 24, 269–297, 2014], which proposes two ADMM-type algorithms for solving convex optimization problems involving compositions with linear operators, and show … Read more
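
For context, a minimal sketch of the textbook ADMM iteration (scaled dual form) on the lasso problem min 0.5*||Dw - y||^2 + gamma*||w||_1 via the splitting w = z. The data, penalty rho, and regularization gamma are made-up illustrative values, and this is the standard method rather than either of the two algorithms discussed in the paper.

    import numpy as np

    np.random.seed(1)
    D = np.random.randn(30, 10)
    y = np.random.randn(30)
    gamma, rho = 0.1, 1.0

    def soft(v, t):
        # proximal map of t*||.||_1 (soft-thresholding)
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    w = z = u = np.zeros(10)         # u is the scaled dual variable
    M = D.T @ D + rho * np.eye(10)   # factor once in a real implementation
    for k in range(200):
        w = np.linalg.solve(M, D.T @ y + rho * (z - u))   # w-update: quadratic subproblem
        z = soft(w + u, gamma / rho)                      # z-update: prox of the l1 term
        u = u + w - z                                     # scaled dual ascent step

    print("primal residual ||w - z|| =", np.linalg.norm(w - z))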

Expander Graph and Communication-Efficient Decentralized Optimization

In this paper, we discuss how to design the graph topology to reduce the communication complexity of certain algorithms for decentralized optimization. Our goal is to minimize the total communication needed to achieve a prescribed accuracy. We discover that the so-called expander graphs are near-optimal choices. We propose three approaches to construct expander graphs for … Read more
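
A small sketch of the ingredient that makes graph topology matter here: a symmetric, doubly stochastic Metropolis mixing matrix built from a graph's adjacency matrix, whose spectral gap governs how fast a decentralized averaging/gradient step mixes information across nodes. The ring graph and the local objectives below are only placeholders; the paper's point is that expander graphs achieve a much larger gap at comparable node degree.

    import numpy as np

    def metropolis_weights(adj):
        # Doubly stochastic mixing matrix from an undirected adjacency matrix.
        n = adj.shape[0]
        deg = adj.sum(axis=1)
        W = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                if i != j and adj[i, j]:
                    W[i, j] = 1.0 / (1.0 + max(deg[i], deg[j]))
            W[i, i] = 1.0 - W[i].sum()
        return W

    def spectral_gap(W):
        # 1 - (second-largest eigenvalue modulus); a larger gap means faster mixing.
        ev = np.sort(np.abs(np.linalg.eigvalsh(W)))
        return 1.0 - ev[-2]

    n = 20
    ring = np.zeros((n, n), dtype=int)
    for i in range(n):                   # cycle graph: each node talks to 2 neighbours
        ring[i, (i + 1) % n] = ring[(i + 1) % n, i] = 1

    W = metropolis_weights(ring)
    print("ring spectral gap:", spectral_gap(W))

    # One decentralized gradient step; X holds one local copy per node, and the
    # communication cost per step is proportional to the number of edges.
    X = np.random.randn(n, 5)
    grads = X                            # e.g. local objectives 0.5*||x_i||^2
    alpha = 0.1
    X = W @ X - alpha * grads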

Adaptive Accelerated Gradient Converging Methods under Hölderian Error Bound Condition

In this paper, we focus our study on the convergence of (proximal) gradient methods and accelerated (proximal) gradient methods for smooth (composite) optimization under a Hölderian error bound (HEB) condition. We first show that the proximal gradient (PG) method is automatically adaptive to HEB, while the accelerated proximal gradient (APG) method can be adaptive to HEB by … Read more
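
A minimal sketch of the baseline method the analysis applies to: the plain proximal gradient (PG) iteration on an l1-regularized least-squares instance with fixed stepsize 1/L. The data and the regularization weight are illustrative, and nothing here encodes the HEB-dependent rates studied in the paper.

    import numpy as np

    np.random.seed(2)
    A = np.random.randn(40, 15)
    b = np.random.randn(40)
    lam = 0.05

    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth part's gradient
    x = np.zeros(15)

    def prox_l1(v, t):
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    for k in range(500):
        grad = A.T @ (A @ x - b)             # gradient of 0.5*||Ax - b||^2
        x = prox_l1(x - grad / L, lam / L)   # proximal gradient step with stepsize 1/L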

Extending the ergodic convergence rate of the proximal ADMM

Pointwise and ergodic iteration-complexity results for the proximal alternating direction method of multipliers (ADMM) for any stepsize in $(0,(1+\sqrt{5})/2)$ have recently been established in the literature. In addition to giving alternative proofs of these results, this paper also extends the ergodic iteration-complexity result to include the case in which the stepsize is equal to $(1+\sqrt{5})/2$. … Read more
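
For reference, the proximal ADMM iteration whose dual stepsize $\theta$ the result concerns has the following general shape, for $\min_{x,z} f(x)+g(z)$ subject to $Ax+Bz=c$, penalty $\beta>0$, and proximal terms $G,H\succeq 0$; the precise proximal terms and assumptions are as in the cited literature, and the display below is only a schematic (with the ergodic result now covering the right endpoint of the stepsize interval):
\[
\begin{aligned}
x^{k+1} &\in \operatorname*{argmin}_x\; f(x) + \langle \lambda^k, Ax\rangle + \tfrac{\beta}{2}\|Ax+Bz^k-c\|^2 + \tfrac12\|x-x^k\|_G^2,\\
z^{k+1} &\in \operatorname*{argmin}_z\; g(z) + \langle \lambda^k, Bz\rangle + \tfrac{\beta}{2}\|Ax^{k+1}+Bz-c\|^2 + \tfrac12\|z-z^k\|_H^2,\\
\lambda^{k+1} &= \lambda^k + \theta\,\beta\,(Ax^{k+1}+Bz^{k+1}-c), \qquad \theta\in\bigl(0,\tfrac{1+\sqrt{5}}{2}\bigr].
\end{aligned}
\]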

How to project onto extended second order cones

The extended second order cones were introduced by S. Z. Németh and G. Zhang in [S. Z. Németh and G. Zhang. Extended Lorentz cones and variational inequalities on cylinders. J. Optim. Theory Appl., 168(3):756-768, 2016] for solving mixed complementarity problems and variational inequalities on cylinders. R. Sznajder in [R. Sznajder. The Lyapunov rank of extended … Read more
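
As background, here is the classical closed-form projection onto the ordinary (non-extended) second order cone $\{(x,t): \|x\|\le t\}$; the projection formulas for the extended cones are the subject of the paper and are not reproduced here.

    import numpy as np

    def project_soc(x, t):
        """Project (x, t) onto the second order (Lorentz) cone {(x, t): ||x|| <= t}."""
        nx = np.linalg.norm(x)
        if nx <= t:                       # already inside the cone
            return x.copy(), t
        if nx <= -t:                      # inside the polar cone: project to the origin
            return np.zeros_like(x), 0.0
        coef = (nx + t) / (2.0 * nx)      # boundary case
        return coef * x, coef * nx

    x, t = np.array([3.0, 4.0]), 1.0
    px, pt = project_soc(x, t)
    print(px, pt, np.linalg.norm(px) <= pt + 1e-12)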

Multilevel Optimization Methods: Convergence and Problem Structure

Building upon multigrid methods, the framework of multilevel optimization methods was developed to solve structured optimization problems, including problems in optimal control, image processing, etc. In this paper, we give a broader view of the multilevel framework and establish some connections between multilevel algorithms and other approaches. An interesting case of the so-called … Read more
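
A very simplified two-level sketch of the multilevel idea for a 1-D quadratic objective: restrict the variable to a coarse grid, solve there cheaply, prolong the coarse solution back as a warm start, then refine on the fine level. The restriction/prolongation operators and the objective below are made-up illustrations, not the coarse models or correction terms analyzed in the paper.

    import numpy as np

    def grad_desc(grad, x, step, iters):
        for _ in range(iters):
            x = x - step * grad(x)
        return x

    n = 64
    Afine = np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)
    bfine = np.ones(n)
    f_grad = lambda x: Afine @ x - bfine                 # fine objective 0.5*x'Ax - b'x

    # Restriction (average pairs of variables) and prolongation (duplicate entries)
    R = np.zeros((n // 2, n))
    for i in range(n // 2):
        R[i, 2 * i] = R[i, 2 * i + 1] = 0.5
    P = 2.0 * R.T

    Acoarse, bcoarse = R @ Afine @ P, R @ bfine          # Galerkin-style coarse problem
    c_grad = lambda x: Acoarse @ x - bcoarse

    xc = grad_desc(c_grad, np.zeros(n // 2), step=0.1, iters=200)   # cheap coarse solve
    x = grad_desc(f_grad, P @ xc, step=0.1, iters=50)               # fine-level refinement
    print("fine-level gradient norm:", np.linalg.norm(f_grad(x)))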

Empirical Risk Minimization: Probabilistic Complexity and Stepsize Strategy

Empirical risk minimization (ERM) is recognized as a special form of standard convex optimization. When using a first-order method, the Lipschitz constant of the empirical risk plays a crucial role in the convergence analysis and stepsize strategies for these problems. We derive probabilistic bounds for such Lipschitz constants using random matrix theory. We … Read more
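
A concrete instance of the quantity in question: for the least-squares empirical risk f(w) = (1/2n)||Xw - y||^2 the gradient is Lipschitz with constant L = sigma_max(X)^2 / n, and the standard fixed stepsize for gradient descent is 1/L. The data below is a made-up example, and the paper's contribution is probabilistic bounds on such constants rather than this deterministic computation.

    import numpy as np

    np.random.seed(3)
    n_samples, d = 200, 20
    X = np.random.randn(n_samples, d)
    y = np.random.randn(n_samples)

    # Lipschitz constant of grad f for f(w) = (1/2n)||Xw - y||^2
    L = np.linalg.norm(X, 2) ** 2 / n_samples
    step = 1.0 / L

    w = np.zeros(d)
    for k in range(300):
        grad = X.T @ (X @ w - y) / n_samples
        w = w - step * grad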

Homotopy Smoothing for Non-Smooth Problems with Lower Complexity than O(1/epsilon)

In this paper, we develop a novel homotopy smoothing (HOPS) algorithm for solving a family of non-smooth problems that is composed of a non-smooth term with an explicit max-structure and either a smooth term or a simple non-smooth term whose proximal mapping is easy to compute. The best known iteration complexity for solving … Read more
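
A toy sketch of the homotopy-smoothing idea on the max-structured objective f(x) = ||Ax - b||_1 = max over ||u||_inf <= 1 of u'(Ax - b): replace f by its Nesterov-smoothed (Huber-like) surrogate with parameter mu, run a first-order method, then halve mu and warm start from the previous stage. The stage schedule, iteration counts, and the use of plain gradient descent (rather than the accelerated scheme in the paper) are all illustrative assumptions.

    import numpy as np

    np.random.seed(4)
    A = np.random.randn(50, 10)
    b = np.random.randn(50)
    opA2 = np.linalg.norm(A, 2) ** 2          # ||A||^2, used for the smoothed Lipschitz constant

    def smoothed_grad(x, mu):
        # gradient of max_{||u||_inf<=1} { u'(Ax-b) - mu/2*||u||^2 } (a sum of Huber terms)
        r = A @ x - b
        u = np.clip(r / mu, -1.0, 1.0)        # maximizer of the smoothed dual problem
        return A.T @ u

    x = np.zeros(10)
    mu = 1.0
    for stage in range(8):                    # homotopy: shrink the smoothing parameter stagewise
        L = opA2 / mu                         # Lipschitz constant of the smoothed gradient
        for k in range(100):
            x = x - smoothed_grad(x, mu) / L  # warm-started first-order steps at this level
        mu *= 0.5

    print("final objective ||Ax - b||_1 =", np.abs(A @ x - b).sum())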