Optimization Algorithms for Data Analysis

We describe the fundamentals of algorithms for minimizing a smooth nonlinear function, and extensions of these methods to the sum of a smooth function and a convex nonsmooth function. Such objective functions are ubiquitous in data analysis applications, as we illustrate using several examples. We discuss methods that make use of gradient (first-order) information about …
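A canonical first-order method for such composite objectives (smooth plus convex nonsmooth) is the proximal gradient method. Below is a minimal NumPy sketch on an illustrative least-squares-plus-$\ell_1$ instance; the data, step size, and iteration count are assumptions made for the example, not taken from the paper.

```python
import numpy as np

def prox_l1(x, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 (smooth part + convex nonsmooth part)."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L, L = Lipschitz constant of the smooth gradient
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth component
        x = prox_l1(x - step * grad, step * lam)  # gradient step, then prox of the nonsmooth component
    return x

# Illustrative data: sparse ground truth plus noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true + 0.1 * rng.standard_normal(50)
print(proximal_gradient(A, b, lam=1.0)[:5])
```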

An optimal randomized incremental gradient method

In this paper, we consider a class of finite-sum convex optimization problems whose objective function is given by the summation of $m$ ($\ge 1$) smooth components together with some other relatively simple terms. We first introduce a deterministic primal-dual gradient (PDG) method that can achieve the optimal black-box iteration complexity for solving these composite optimization …
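To make the finite-sum structure concrete, here is a minimal sketch of a plain randomized incremental gradient step, which samples a single component $f_i$ per iteration. It is a baseline illustration only, not the optimal primal-dual method developed in the paper; the quadratic components and step size are made up for the example.

```python
import numpy as np

def incremental_gradient(grads, x0, step, n_iter=2000, seed=0):
    """Basic randomized incremental gradient: each iteration samples one
    component i and steps along its gradient only (baseline sketch,
    not the paper's primal-dual or randomized PDG method)."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    m = len(grads)
    for _ in range(n_iter):
        i = rng.integers(m)          # pick one smooth component at random
        x -= step * grads[i](x)      # use only grad f_i, not the full gradient
    return x

# Illustrative finite sum: f_i(x) = 0.5*||x - a_i||^2, whose minimizer is the mean of the a_i.
rng = np.random.default_rng(1)
a = rng.standard_normal((10, 5))                       # m = 10 components in R^5
grads = [lambda x, ai=ai: x - ai for ai in a]
x_hat = incremental_gradient(grads, np.zeros(5), step=0.05)
# With a constant step the iterate hovers near the minimizer, up to noise of the order of the step size.
print("distance to the mean:", float(np.linalg.norm(x_hat - a.mean(axis=0))))
```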

Gradient Sliding for Composite Optimization

We consider in this paper a class of composite optimization problems whose objective function is given by the summation of a general smooth component and a nonsmooth component, together with a relatively simple nonsmooth term. We present a new class of first-order methods, namely the gradient sliding algorithms, which can skip the computation of the gradient for …
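The following structural sketch illustrates the idea of reusing one gradient of the smooth component across several cheap subgradient steps on the nonsmooth component. It is not the paper's gradient sliding algorithm (it omits the accelerated weights and parameter schedule behind its complexity guarantees); the instance, step size, and loop counts are illustrative assumptions.

```python
import numpy as np

def sliding_sketch(grad_f, subgrad_h, x0, outer=50, inner=20, step=0.01):
    """Structural sketch only: reuse one gradient of the smooth part f across an
    inner loop of subgradient steps on the nonsmooth part h. This mimics the
    'skip the gradient computation' idea but is NOT the paper's gradient sliding method."""
    x = x0.copy()
    for _ in range(outer):
        g = grad_f(x)                         # one (expensive) gradient of the smooth component
        for _ in range(inner):
            x -= step * (g + subgrad_h(x))    # cheap subgradient steps reuse the stale gradient g
    return x

# Illustrative instance: f(x) = 0.5*||x||^2 (smooth), h(x) = ||x - c||_1 (nonsmooth).
c = np.array([1.0, -2.0, 0.5])
x_hat = sliding_sketch(grad_f=lambda x: x,
                       subgrad_h=lambda x: np.sign(x - c),
                       x0=np.zeros(3))
print(x_hat)  # approaches the minimizer of this instance, roughly [1, -1, 0.5]
```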

The Maximum Box Problem and its Application to Data Analysis

Given two finite sets of points $X^+$ and $X^-$ in $\mathbb{R}^n$, the maximum box problem consists in finding an interval (“box”) $B=\{x : l \leq x \leq u\}$ such that $B\cap X^-=\emptyset$, and the cardinality of $B\cap X^+$ is maximized. A simple generalization can be obtained by instead maximizing a weighted sum of the elements …
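For intuition, a tiny exact brute force can restrict candidate bounds to coordinates of points in $X^+$, since a maximum box can be shrunk to the bounding box of the $X^+$ points it contains without losing feasibility. The sketch below does this in NumPy on a made-up 2D instance; it is only practical for very small inputs and is not the approach proposed in the paper.

```python
import itertools
import numpy as np

def in_box(p, l, u):
    """True if point p lies in the box {x : l <= x <= u}."""
    return bool(np.all(l <= p) and np.all(p <= u))

def max_box_bruteforce(X_pos, X_neg):
    """Exhaustive search over boxes whose bounds come from X+ coordinates
    (illustration only; exponential in the dimension)."""
    n = X_pos.shape[1]
    # Candidate (lower, upper) pairs per dimension, drawn from X+ coordinates.
    per_dim = [[(l, u) for l in set(X_pos[:, d]) for u in set(X_pos[:, d]) if l <= u]
               for d in range(n)]
    best_count, best_box = -1, None
    for combo in itertools.product(*per_dim):
        l = np.array([c[0] for c in combo])
        u = np.array([c[1] for c in combo])
        if any(in_box(q, l, u) for q in X_neg):      # box must avoid every X- point
            continue
        count = sum(in_box(p, l, u) for p in X_pos)  # number of X+ points captured
        if count > best_count:
            best_count, best_box = count, (l, u)
    return best_count, best_box

# Small 2D example with made-up data.
X_pos = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5], [3.0, 3.0]])
X_neg = np.array([[2.5, 2.5], [1.5, 2.0]])
print(max_box_bruteforce(X_pos, X_neg))  # best feasible box captures 3 of the 4 X+ points
```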