Decentralized Learning with Lazy and Approximate Dual Gradients

This paper develops algorithms for decentralized machine learning over a network, where data are distributed, computation is localized, and communication is restricted to exchanges between neighboring nodes. A line of recent research in this area focuses on improving both computation and communication complexities. The methods SSDA and MSDA \cite{scaman2017optimal} have optimal communication complexity when the objective is smooth …
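As background for this setting (a standard consensus formulation underlying dual methods such as SSDA, written in our own notation rather than taken from the paper): stacking the local copies x_i into a matrix X and letting W be a gossip matrix whose null space is the consensus subspace, the problem and its dual read
\[
\min_{X}\ \sum_{i=1}^{n} f_i(x_i) \quad \text{s.t.} \quad \sqrt{W}\, X = 0,
\qquad
\max_{\Lambda}\ -\sum_{i=1}^{n} f_i^{*}\big((\sqrt{W}\,\Lambda)_i\big).
\]
Each dual gradient evaluation then costs one local computation of ∇f_i^* per node plus one multiplication by √W, i.e., one round of neighbor-to-neighbor communication; presumably these are the two costs that "lazy" and "approximate" dual gradients economize.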

Convergence analysis under consistent error bounds

We introduce the notion of consistent error bound functions, which provides a unifying framework for error bounds for multiple convex sets. This framework goes beyond the classical Lipschitzian and Hölderian error bounds and includes the logarithmic and entropic error bounds found in the exponential cone. It also includes the error bounds obtainable under the theory of …
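For orientation (a textbook definition, not this paper's new notion): a Hölderian error bound for a convex set S = {x : f(x) ≤ 0}, with f convex, asserts the existence of τ > 0 and an exponent γ ∈ (0, 1] such that
\[
\operatorname{dist}(x, S) \;\le\; \tau \,\max\{f(x),\, 0\}^{\gamma} \qquad \text{for all } x \text{ in a fixed bounded set},
\]
with γ = 1 recovering the Lipschitzian case. The exponential cone fails to admit bounds of this power form near certain boundary points, which is why logarithmic and entropic expressions are needed there.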

Tight bounds on Lyapunov rank

The Lyapunov rank of a cone is the number of independent equations obtainable from an analogue of the complementary slackness condition in cone programming problems, and more equations are generally thought to be better. Bounding the Lyapunov rank of a proper cone in R^n from above is an open problem. Gowda and Tao gave an …
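Concretely (the standard definition): the Lyapunov rank of a proper cone K ⊆ R^n is
\[
\beta(K) \;=\; \dim\,\{\, L \in \mathbb{R}^{n\times n} \;:\; \langle Lx, s\rangle = 0 \ \text{whenever}\ x \in K,\ s \in K^{*},\ \langle x, s\rangle = 0 \,\}.
\]
For example, on the nonnegative orthant the Lyapunov-like maps are exactly the diagonal matrices, so β(R^n_+) = n: the complementarity conditions x_i s_i = 0 contribute one independent equation per coordinate.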

Proscribed normal decompositions of Euclidean Jordan algebras

Normal decomposition systems unify many results from convex matrix analysis regarding functions that are invariant with respect to a group of transformations, particularly those matrix functions that are unitarily invariant and the affiliated permutation-invariant “spectral functions” that depend only on eigenvalues. Spectral functions extend in a natural way to Euclidean Jordan algebras, and several authors have studied …
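For reference (standard background): a spectral function on the Hermitian matrices, and by extension on a Euclidean Jordan algebra, takes the form
\[
F(X) \;=\; f(\lambda(X)), \qquad f(Pz) = f(z) \ \text{for every permutation matrix } P,
\]
where λ(X) lists the eigenvalues of X; examples include X ↦ λ_max(X) and X ↦ log det(X) on the positive definite cone. A normal decomposition system abstracts exactly the pair of ingredients that makes such reductions work: the invariance group and the eigenvalue-type map.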

On the strong concavity of the dual function of an optimization problem

We provide three new proofs of the strong concavity of the dual function of some convex optimization problems. For problems with nonlinear constraints, we show that the assumption of strong convexity of the objective cannot be weakened to convexity and that the assumption that the gradients of all constraints at the optimal solution are …
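As a simple worked illustration of both assumptions (our example, not taken from the paper): for a strongly convex quadratic with linear equality constraints,
\[
\min_{x}\ \tfrac{1}{2} x^{\top} Q x \quad \text{s.t.} \quad Ax = b, \qquad Q \succ 0,
\]
minimizing the Lagrangian L(x, λ) = ½ xᵀQx + λᵀ(Ax − b) over x gives x(λ) = −Q^{−1}Aᵀλ and the dual function
\[
g(\lambda) \;=\; -\tfrac{1}{2}\, \lambda^{\top} A Q^{-1} A^{\top} \lambda \;-\; b^{\top} \lambda,
\]
which is strongly concave precisely when A has full row rank, i.e., when the constraint gradients are linearly independent.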

Memory-efficient structured convex optimization via extreme point sampling

Memory is a key computational bottleneck when solving large-scale convex optimization problems such as semidefinite programs (SDPs). In this paper, we focus on the regime in which storing an n × n matrix decision variable is prohibitive. To solve SDPs in this regime, we develop a randomized algorithm that returns a random vector whose covariance …
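To convey the flavor of extreme point sampling (a minimal sketch under our own simplifications, not the paper's algorithm): if a conditional-gradient-type method updates a matrix iterate as X⁺ = (1 − γ) X + γ vvᵀ, one can carry a single random vector z with E[zzᵀ] = X in O(n) memory by jumping to the new extreme point with probability γ:

import numpy as np

def sample_from_rank_one_updates(extreme_points, step_sizes, rng=None):
    # Track one random vector z with E[z z^T] equal to the matrix iterate
    # X_{k+1} = (1 - gamma_k) X_k + gamma_k v_k v_k^T, without storing X.
    # `extreme_points` (vectors v_k) and `step_sizes` (gamma_k in [0, 1]) are
    # hypothetical inputs from some rank-one-update run; the names are ours.
    rng = np.random.default_rng(rng)
    z = None
    for v, gamma in zip(extreme_points, step_sizes):
        if z is None or rng.random() < gamma:
            z = np.array(v, dtype=float)  # prob. gamma: jump to the new extreme point
        # prob. 1 - gamma: keep z, so E[z z^T] tracks the convex combination
    return z

Averaging zzᵀ over independent repetitions estimates the final matrix iterate without ever forming an n × n array.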

A simplified treatment of Ramana’s exact dual for semidefinite programming

In semidefinite programming the dual may fail to attain its optimal value, and there could be a duality gap, i.e., the primal and dual optimal values may differ. In a striking paper, Ramana proposed a polynomial-size extended dual that does not have these deficiencies and yields a number of fundamental results in complexity theory. …
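To see the first deficiency concretely (a textbook example, not from the paper): in the SDP
\[
\inf_{x,\,y}\ x \quad \text{s.t.} \quad \begin{pmatrix} x & 1 \\ 1 & y \end{pmatrix} \succeq 0,
\]
feasibility forces x > 0 with xy ≥ 1, so taking x = 1/y with y → ∞ drives the objective to its optimal value 0, which no feasible point attains. Ramana's extended dual is immune to both nonattainment and a positive gap.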

An Alternative Perspective on Copositive and Convex Relaxations of Nonconvex Quadratic Programs

We study convex relaxations of nonconvex quadratic programs. We identify a family of so-called feasibility-preserving convex relaxations, which includes the well-known copositive and doubly nonnegative relaxations, with the property that the convex relaxation is feasible if and only if the nonconvex quadratic program is feasible. We observe that each convex relaxation in this family …
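For reference (classical background rather than this paper's contribution): the standard quadratic program over the simplex,
\[
\min_{x}\ x^{\top} Q x \quad \text{s.t.} \quad e^{\top} x = 1,\ x \ge 0,
\]
is equivalent to the completely positive program
\[
\min_{X}\ \langle Q, X\rangle \quad \text{s.t.} \quad \langle ee^{\top}, X\rangle = 1,\ X \in \mathcal{CP}_{n},
\]
whose conic dual lives over the copositive cone; relaxing X ∈ CP_n to X ⪰ 0 with X ≥ 0 entrywise yields the doubly nonnegative relaxation, the two family members named in the abstract.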

Two novel gradient methods with optimal step sizes

In this work we introduce two new Barzilai-Borwein-like step sizes for the classical gradient method for strictly convex quadratic optimization problems. The proposed step sizes employ second-order information in order to obtain faster gradient-type methods. Both step sizes are derived from two unconstrained optimization models that involve approximate information of the Hessian of …
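For context, the classical Barzilai-Borwein step sizes that these resemble are
\[
\alpha_k^{\mathrm{BB1}} = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}},
\qquad
\alpha_k^{\mathrm{BB2}} = \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}},
\qquad
s_{k-1} = x_k - x_{k-1},\quad y_{k-1} = g_k - g_{k-1},
\]
both of which extract curvature information of the Hessian from consecutive gradients. A minimal gradient-method sketch on a strictly convex quadratic using BB1 (our illustration; the paper's two new step sizes differ):

import numpy as np

def bb1_gradient_method(Q, b, x0, iters=200, tol=1e-10):
    # Gradient method for f(x) = 0.5 x^T Q x - b^T x with the classical BB1
    # step size; for SPD Q we have s^T y = s^T Q s > 0, so alpha is well defined.
    x = np.array(x0, dtype=float)
    g = Q @ x - b
    alpha = 1.0 / np.linalg.norm(Q, 2)  # conservative first step: 1 / L
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = Q @ x_new - b
        if np.linalg.norm(g_new) < tol:
            return x_new
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)  # BB1: reciprocal Rayleigh quotient of Q along s
        x, g = x_new, g_new
    return x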

Towards practical generic conic optimization

Many convex optimization problems can be represented through conic extended formulations with auxiliary variables and constraints using only the small number of standard cones recognized by advanced conic solvers such as MOSEK 9. Such extended formulations are often significantly larger and more complex than equivalent conic natural formulations, which can use a much broader class …
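To make the natural-versus-extended distinction concrete (our example, not the paper's): the constraint t ≥ x⁴ is, for a solver that supports three-dimensional power cones, a single natural conic membership
\[
(t, 1, x) \in \mathcal{P}^{3}_{1/4} = \{(a, b, c) : a^{1/4} b^{3/4} \ge |c|,\ a, b \ge 0\},
\]
whereas a solver limited to quadratic cones needs an extended formulation with an auxiliary variable u and two rotated second-order cone constraints x² ≤ u and u² ≤ t. Higher-degree or vector-valued instances of this pattern multiply the auxiliary variables and constraints, which is the growth the abstract refers to.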