Regret Minimization in Stochastic Non-Convex Learning via a Proximal-Gradient Approach

Motivated by applications in machine learning and operations research, we study regret minimization with stochastic first-order oracle feedback in online, constrained, and possibly non-smooth, non-convex problems. In this setting, the minimization of external regret is beyond reach, so we focus on a local regret measure defined via a proximal-gradient residual mapping. To achieve no (local) … Read more
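
(For orientation, since the abstract is truncated here: a standard form of the proximal-gradient residual mapping used to define such local regret measures, with step size \eta > 0, smooth loss f_t, and non-smooth constrained part g, is

    G_\eta(x) = \tfrac{1}{\eta} \bigl( x - \mathrm{prox}_{\eta g}( x - \eta \nabla f_t(x) ) \bigr),

and local regret is then typically measured as \sum_{t=1}^{T} \| G_\eta(x_t) \|^2; the exact definition adopted in the paper may differ from this sketch.)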

An optimal first-order primal-dual gap reduction framework for constrained convex optimization

We introduce an analysis framework for constructing optimal first-order primal-dual methods for the prototypical constrained convex optimization template. While this class of methods offers scalability advantages in obtaining numerical solutions, it has the disadvantage of producing sequences that are only approximately feasible with respect to the problem constraints. As a result, it is theoretically challenging … Read more
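
(For context, the prototypical template referred to above, which this excerpt does not spell out, is usually of the form

    \min_{x \in \mathcal{X}} f(x) \quad \text{subject to} \quad A x = b,

with Lagrange dual d(\lambda) = \min_{x \in \mathcal{X}} \{ f(x) + \langle \lambda, A x - b \rangle \}. The primal-dual gap f(x) - d(\lambda), together with the feasibility residual \| A x - b \|, is what such frameworks drive to zero, which is why the approximate feasibility of the iterates is the delicate point. This is a standard formulation, assumed here rather than quoted from the paper.)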

A Primal-Dual Algorithmic Framework for Constrained Convex Minimization

We present a primal-dual algorithmic framework to obtain approximate solutions to a prototypical constrained convex optimization problem, and rigorously characterize how common structural assumptions affect the numerical efficiency. Our main analysis technique provides a fresh perspective on Nesterov’s excessive gap technique in a structured fashion and unifies it with smoothing and primal-dual methods. For instance, … Read more
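
(A rough sketch of the excessive gap idea, with notation assumed rather than taken from this excerpt: given a smoothed model f_\mu of the primal objective with f - \mu D_1 \le f_\mu \le f, and a smoothed model \varphi_\gamma of the concave dual with \varphi \le \varphi_\gamma \le \varphi + \gamma D_2, the method maintains a primal-dual pair (\bar{x}, \bar{u}) satisfying the excessive gap condition

    f_\mu(\bar{x}) \le \varphi_\gamma(\bar{u}),

which implies f(\bar{x}) - \varphi(\bar{u}) \le \mu D_1 + \gamma D_2; driving \mu and \gamma to zero then closes the duality gap.)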

An inexact proximal path-following algorithm for constrained convex minimization

Many scientific and engineering applications feature large-scale non-smooth convex minimization problems over convex sets. In this paper, we address an important instance of this broad class where we assume that the non-smooth objective is equipped with a tractable proximity operator and that the convex constraints afford a self-concordant barrier. We provide a new joint treatment … Read more
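
(To fix ideas, under a standard formulation assumed here rather than quoted from the paper: with a non-smooth convex objective \varphi equipped with a tractable proximity operator and a \nu-self-concordant barrier b(x) for the constraint set, a proximal path-following scheme tracks the central path

    x^\star(t) = \arg\min_x \{ t \, \varphi(x) + b(x) \}, \qquad t \to \infty,

increasing the parameter t along a schedule and solving each subproblem only inexactly via proximal Newton-type steps; the paper's exact parameterization and inexactness criteria may differ.)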

Composite Self-concordant Minimization

We propose a variable metric framework for minimizing the sum of a self-concordant function and a possibly non-smooth convex function endowed with a computable proximal operator. We theoretically establish the convergence of our framework without relying on the usual Lipschitz gradient assumption on the smooth part. An important highlight of our work is a new … Read more
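
(As a minimal illustration of a variable-metric proximal step of the kind such frameworks build on: the sketch below uses a diagonal metric and an \ell_1 non-smooth term so that the scaled proximal operator has a closed form. All names are hypothetical, and this is an illustrative sketch rather than the paper's algorithm.)

    import numpy as np

    def prox_l1_scaled(z, lam, d):
        # Proximal operator of lam*||u||_1 in the metric induced by D = diag(d):
        # argmin_u { lam*||u||_1 + 0.5*(u - z)^T D (u - z) },
        # which separates into coordinate-wise soft-thresholding at level lam/d_i.
        return np.sign(z) * np.maximum(np.abs(z) - lam / d, 0.0)

    def variable_metric_prox_step(x, grad_f, hess_diag, lam):
        # One variable-metric proximal step for f(x) + lam*||x||_1,
        # with diagonal metric D = diag(hess_diag(x)):
        #   x_next = prox^D_{lam*||.||_1}( x - D^{-1} grad_f(x) ).
        d = hess_diag(x)
        z = x - grad_f(x) / d
        return prox_l1_scaled(z, lam, d)

    # Example: f(x) = 0.5*||x - a||^2 has identity Hessian, so starting from x = 0
    # the step reduces to ordinary soft-thresholding of a.
    a = np.array([3.0, -0.2, 0.5])
    x_next = variable_metric_prox_step(
        np.zeros(3),
        grad_f=lambda x: x - a,
        hess_diag=lambda x: np.ones_like(x),
        lam=1.0,
    )
    print(x_next)  # only the first coordinate survives the threshold lam = 1.0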