# On Lower Complexity Bounds for Large-Scale Smooth Convex Optimization

In this note we present tight lower bounds on the information-based complexity of large-scale smooth convex minimization problems. We demonstrate, in particular, that the k-step Conditional Gradient (a.k.a. Frank-Wolfe) algorithm as applied to minimizing smooth convex functions over the n-dimensional box with n ≥ k is optimal, up to an O(ln n)-factor, in terms of information-based complexity.
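To make the object of study concrete, here is a minimal sketch (not from the paper) of the Conditional Gradient (Frank-Wolfe) iteration applied to a smooth convex function over the n-dimensional box, using the standard step size 2/(k+2); the function `frank_wolfe_box` and the quadratic test objective are illustrative assumptions, not the authors' code.

```python
import numpy as np

def frank_wolfe_box(grad, x0, lo, hi, num_steps=100):
    """Conditional Gradient (Frank-Wolfe) over the box [lo, hi]^n.

    grad: callable returning the gradient of the smooth convex objective.
    Each step solves the linear subproblem min_{s in box} <grad(x), s>,
    whose solution is attained at a vertex of the box, then moves to a
    convex combination of the current point and that vertex.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(num_steps):
        g = grad(x)
        # Linear minimization oracle over the box: coordinate-wise,
        # pick the lower bound where the gradient is positive, else the upper.
        s = np.where(g > 0, lo, hi)
        gamma = 2.0 / (k + 2)  # standard open-loop step size
        x = x + gamma * (s - x)
    return x

# Illustrative example: minimize f(x) = 0.5 * ||x - c||^2 over [-1, 1]^3,
# where c lies inside the box, so the minimizer is c itself.
c = np.array([0.3, -0.7, 0.5])
x_star = frank_wolfe_box(lambda x: x - c, x0=np.zeros(3),
                         lo=-1.0, hi=1.0, num_steps=500)
```

Each iteration queries only first-order information (a gradient) plus a linear minimization oracle over the feasible set, which is exactly the oracle model in which the note's lower bounds are stated.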
