Structured Nonconvex and Nonsmooth Optimization: Algorithms and Iteration Complexity Analysis

Nonconvex optimization problems are frequently encountered in statistics, business, science, and engineering, yet they are not widely recognized as a technology. One reason for this relatively low popularity is the lack of a well-developed system of theory and algorithms to support applications, as is the case for its … Read more

An Extended Frank-Wolfe Method with “In-Face” Directions, and its Application to Low-Rank Matrix Completion

We present an extension of the Frank-Wolfe method that is designed to induce near-optimal solutions on low-dimensional faces of the feasible region. We present computational guarantees for the method that trade off efficiency in computing near-optimal solutions against upper bounds on the dimension of the minimal faces of the iterates. We apply our method to the low-rank … Read more
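The paper builds on the classic Frank-Wolfe (conditional gradient) method. As background, here is a minimal Python sketch of the classic method over the probability simplex; the "in-face" directions that are the paper's contribution are not implemented, and the quadratic test objective and problem size are illustrative assumptions (the paper itself works over the nuclear-norm ball).

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, num_iters=200):
    """Classic Frank-Wolfe: each step solves a linear problem over the set."""
    x = x0.copy()
    for k in range(num_iters):
        g = grad(x)
        # Linear oracle over the simplex: the minimizer is a unit vector.
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0
        gamma = 2.0 / (k + 2.0)          # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 50
    A = rng.standard_normal((n, n))
    Q = A.T @ A + np.eye(n)              # positive definite
    b = rng.standard_normal(n)
    grad = lambda x: Q @ x - b           # gradient of 0.5*x'Qx - b'x
    x = frank_wolfe_simplex(grad, np.ones(n) / n)
    print("objective:", 0.5 * x @ Q @ x - b @ x)
```

Because each iterate is a convex combination of at most k+1 vertices, the iterates are naturally sparse; the in-face directions studied in the paper refine this by steering iterates toward low-dimensional faces.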

New results on subgradient methods for strongly convex optimization problems with a unified analysis

We develop subgradient- and gradient-based methods for minimizing strongly convex functions under a notion of strong convexity that generalizes the standard Euclidean one. We propose a unifying framework for subgradient methods that yields two kinds of methods, namely the Proximal Gradient Method (PGM) and the Conditional Gradient Method (CGM), and subsumes several existing methods. The unifying framework provides … Read more
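As background for the Euclidean special case, here is a minimal sketch of a projected subgradient method for a strongly convex objective, using the classical 2/(mu*(k+1)) step size with weighted averaging; the paper's generalized non-Euclidean setting and its unified PGM/CGM framework are not reproduced here, and the toy objective is an assumption.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, mu, num_iters=2000):
    """Projected subgradient for a mu-strongly convex objective.
    Step size 2/(mu*(k+1)) with iterate weights proportional to k,
    the classical scheme yielding an O(1/k) rate in the Euclidean case."""
    x = x0.copy()
    avg = np.zeros_like(x0)
    total = 0.0
    for k in range(1, num_iters + 1):
        g = subgrad(x)
        x = project(x - (2.0 / (mu * (k + 1))) * g)
        avg += k * x                     # weight iterate k by k
        total += k
    return avg / total

if __name__ == "__main__":
    # Toy problem: f(x) = |x - 1| + 0.5*x^2 over [-2, 2], which is
    # 1-strongly convex with minimizer x* = 1.
    subgrad = lambda x: np.sign(x - 1.0) + x
    project = lambda x: np.clip(x, -2.0, 2.0)
    print(projected_subgradient(subgrad, project, np.array([-2.0]), mu=1.0))
```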

The Cyclic Block Conditional Gradient Method for Convex Optimization Problems

In this paper we study the convex problem of minimizing the sum of a smooth function and a compactly supported nonsmooth term with a specific separable structure. We analyze the block version of the generalized conditional gradient method when the blocks are chosen in cyclic order. A global sublinear rate of convergence is established … Read more
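As a concrete (assumed) instance of this setting, the sketch below performs one conditional gradient step per block, visiting the blocks cyclically, on a feasible set that is a product of probability simplices; the paper's generalized conditional gradient with a nonsmooth separable term is more general than this smooth special case.

```python
import numpy as np

def cyclic_block_cg(grad, blocks, x0, num_cycles=200):
    """One conditional gradient step per block, blocks visited cyclically.
    The feasible set is assumed to be a product of probability simplices,
    so each block's linear subproblem is solved by picking a vertex."""
    x = x0.copy()
    t = 0
    for _ in range(num_cycles):
        for idx in blocks:
            g = grad(x)[idx]
            s = np.zeros_like(g)
            s[np.argmin(g)] = 1.0            # block linear oracle
            gamma = 2.0 / (t + 2.0)          # open-loop step size
            x[idx] = (1.0 - gamma) * x[idx] + gamma * s
            t += 1
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 10
    b = rng.standard_normal(n)
    grad = lambda x: x - b                   # gradient of 0.5*||x - b||^2
    blocks = [np.arange(0, 5), np.arange(5, 10)]
    x0 = np.full(n, 0.2)                     # each block sums to 1
    print(cyclic_block_cg(grad, blocks, x0))
```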

Conditional Gradient Sliding for Convex Optimization

In this paper, we present a new conditional-gradient-type method for convex optimization that uses a linear optimization (LO) oracle to minimize a series of linear functions over the feasible set. Unlike the classic conditional gradient method, the conditional gradient sliding (CGS) algorithm developed herein can skip the computation of gradients from time … Read more
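The mechanism can be sketched as follows: an outer accelerated loop evaluates the gradient, and an inner conditional gradient loop replaces the exact projection with an approximate quadratic minimization stopped by a Wolfe-gap test. The sketch below follows that structure over the simplex; the specific parameter constants (gamma, beta, eta) follow the general pattern from the literature but are assumptions here and should be checked against the paper.

```python
import numpy as np

def lo_simplex(c):
    """Linear optimization oracle over the probability simplex."""
    s = np.zeros_like(c)
    s[np.argmin(c)] = 1.0
    return s

def cnd_g(g, w, beta, eta, max_inner=10000):
    """Inner loop: run conditional gradient on the quadratic
    phi(u) = <g, u> + (beta/2)*||u - w||^2 until its Wolfe gap <= eta,
    instead of computing an exact projection."""
    u = w.copy()
    for _ in range(max_inner):
        q = g + beta * (u - w)              # gradient of phi at u
        v = lo_simplex(q)
        gap = q @ (u - v)                   # Wolfe gap of phi at u
        if gap <= eta:
            break
        alpha = min(1.0, gap / (beta * (v - u) @ (v - u)))  # exact line search
        u = u + alpha * (v - u)
    return u

def cgs(grad, L, D, x0, num_iters=50):
    """Outer accelerated loop of conditional gradient sliding (sketch)."""
    x, y = x0.copy(), x0.copy()
    for k in range(1, num_iters + 1):
        gamma = 3.0 / (k + 2.0)             # assumed parameter pattern
        beta = 3.0 * L / (k + 1.0)
        eta = L * D**2 / (k * (k + 1.0))
        z = (1.0 - gamma) * y + gamma * x
        x = cnd_g(grad(z), x, beta, eta)    # gradient of f evaluated only here
        y = (1.0 - gamma) * y + gamma * x
    return y
```

The point of the nesting is that the gradient of the objective is computed only once per outer iteration, while the inner loop uses only the LO oracle, which is how gradient computations can be skipped relative to LO calls.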

On Lower Complexity Bounds for Large-Scale Smooth Convex Optimization

In this note we present tight lower bounds on the information-based complexity of large-scale smooth convex minimization problems. We demonstrate, in particular, that the k-step Conditional Gradient (a.k.a. Frank-Wolfe) algorithm as applied to minimizing smooth convex functions over the n-dimensional box with n ≥ k is optimal, up to an O(ln n)-factor, in terms of … Read more
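For reference, the upper bound these lower bounds are matched against is the classical Frank-Wolfe guarantee (a standard fact, stated here for context) for an L-smooth convex f over a compact convex set X of diameter D, with step size gamma_k = 2/(k+2):

```latex
f(x_k) - \min_{x \in X} f(x) \;\le\; \frac{2 L D^2}{k + 2}, \qquad k \ge 1.
```

The note's lower bounds show that, over the n-dimensional box in the regime n >= k, no method in this oracle model can do substantially better, which is what makes the k-step Conditional Gradient optimal up to the stated O(ln n) factor.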

New Analysis and Results for the Conditional Gradient Method

We present new results for the conditional gradient method (also known as the Frank-Wolfe method). We derive computational guarantees for arbitrary step-size sequences, which are then applied to various step-size rules, including simple averaging and constant step-sizes. We also develop step-size rules and computational guarantees that depend naturally on the warm-start quality of the initial … Read more
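To make "arbitrary step-size sequences" concrete, here is a minimal sketch of the conditional gradient method parameterized by a pluggable step-size rule; the rules shown mirror those named in the abstract (open-loop, constant, and simple averaging), while the oracle interface and the constant value are illustrative assumptions.

```python
import numpy as np

def conditional_gradient(grad, lo, x0, step_rule, num_iters=200):
    """Conditional gradient with an arbitrary step-size sequence.
    step_rule(k) must return gamma_k in (0, 1]."""
    x = x0.copy()
    for k in range(num_iters):
        s = lo(grad(x))                      # linear optimization oracle
        gamma = step_rule(k)
        x = (1.0 - gamma) * x + gamma * s
    return x

# Step-size rules of the kind analyzed in this line of work:
open_loop = lambda k: 2.0 / (k + 2.0)   # classic open-loop rule
constant  = lambda k: 0.05              # constant step size (illustrative value)
averaging = lambda k: 1.0 / (k + 1.0)   # makes x_k the simple average of the oracle outputs
```

With the averaging rule, the recursion x_{k+1} = (1 - 1/(k+1)) x_k + (1/(k+1)) s_k makes each iterate the running mean of the oracle outputs s_0, ..., s_k, which is why it is called simple averaging.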

The Complexity of Large-scale Convex Programming under a Linear Optimization Oracle

This paper considers a general class of iterative optimization algorithms, referred to as linear-optimization-based convex programming (LCP) methods, for solving large-scale convex programming (CP) problems. The LCP methods, which cover the classic conditional gradient (CG) method (a.k.a. the Frank-Wolfe method) as a special case, access the feasible set only by solving a linear optimization subproblem at each iteration. In this paper, … Read more
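The access model behind LCP methods can be made concrete with a small sketch: the algorithm touches the feasible set X only through a linear optimization (LO) oracle, which for sets like the l1 ball is far cheaper than a projection. The oracle below is a standard textbook example, not taken from the paper.

```python
from typing import Callable
import numpy as np

# An LCP method interacts with the feasible set X only through this map:
LinearOracle = Callable[[np.ndarray], np.ndarray]  # c -> argmin_{x in X} <c, x>

def lo_l1_ball(c: np.ndarray, radius: float = 1.0) -> np.ndarray:
    """LO oracle for the l1 ball of the given radius: the minimizer of a
    linear function over the ball is a signed, scaled coordinate vector."""
    i = int(np.argmax(np.abs(c)))
    s = np.zeros_like(c, dtype=float)
    s[i] = -radius * np.sign(c[i])
    return s
```

A Euclidean projection onto the l1 ball, by contrast, requires a sorting- or selection-based procedure, while the LO oracle needs only one pass over c; this gap in per-iteration cost is the efficiency that LCP methods exploit.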