Performance Estimation for Smooth and Strongly Convex Sets

We extend recent computer-assisted design and analysis techniques for first-order optimization over structured functions, known as performance estimation, to apply to structured sets. We prove “interpolation theorems” for smooth and strongly convex sets with Slater points and bounded diameter, showing that a wide range of extremal questions amount to structured mathematical programs. Prior function interpolation theorems are recovered …
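
For context, one commonly used definition of set strong convexity (the paper's exact parameterization may differ): a convex set $C \subseteq \mathbb{R}^n$ is $\alpha$-strongly convex with respect to a norm $\|\cdot\|$ if for all $x, y \in C$, all $\lambda \in [0,1]$, and all $z$ with $\|z\| \le 1$,

$\lambda x + (1-\lambda)y + \tfrac{\alpha}{2}\lambda(1-\lambda)\|x-y\|^2 \, z \in C,$

that is, $C$ contains a ball of radius $\tfrac{\alpha}{2}\lambda(1-\lambda)\|x-y\|^2$ around every convex combination of its points.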

Using Taylor-Approximated Gradients to Improve the Frank-Wolfe Method for Empirical Risk Minimization

The Frank-Wolfe method has become increasingly useful in statistical and machine learning applications, due to the structure-inducing properties of the iterates, and especially in settings where linear minimization over the feasible set is more computationally efficient than projection. In the setting of Empirical Risk Minimization — one of the fundamental optimization problems in statistical and …
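
For reference, here is a minimal sketch of the classic (deterministic) Frank-Wolfe iteration over the $\ell_1$ ball, where the linear minimization oracle reduces to picking a single signed coordinate; this illustrates the projection-free structure referred to above, not the paper's Taylor-approximated-gradient variant.

```python
import numpy as np

def lmo_l1_ball(gradient, radius=1.0):
    """Linear minimization oracle for the l1 ball:
    the minimizer of <gradient, v> over ||v||_1 <= radius is a signed vertex."""
    i = np.argmax(np.abs(gradient))
    v = np.zeros_like(gradient)
    v[i] = -radius * np.sign(gradient[i])
    return v

def frank_wolfe(grad_f, x0, radius=1.0, iterations=100):
    """Classic Frank-Wolfe: step toward the LMO vertex with step size 2/(t+2)."""
    x = x0.copy()
    for t in range(iterations):
        g = grad_f(x)
        v = lmo_l1_ball(g, radius)        # linear minimization, no projection needed
        gamma = 2.0 / (t + 2.0)           # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * v
    return x

# Toy usage: least squares over the l1 ball of radius 1.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 20)), rng.standard_normal(50)
x_hat = frank_wolfe(lambda x: 2.0 * A.T @ (A @ x - b), np.zeros(20))
print(np.linalg.norm(x_hat, 1))           # iterate stays in the feasible l1 ball
```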

Analysis of the Frank-Wolfe Method for Convex Composite Optimization involving a Logarithmically-Homogeneous Barrier

We present and analyze a new generalized Frank-Wolfe method for the composite optimization problem (P): $F^* := \min_x f(Ax) + h(x)$, where $f$ is a $\theta$-logarithmically-homogeneous self-concordant barrier and the function $h$ has bounded domain but is possibly non-smooth. We show that our generalized Frank-Wolfe method requires $O((\mathrm{Gap}_0 + \theta + \mathrm{Var}_h)\ln(\delta_0) + (\theta + \mathrm{Var}_h)^2/\epsilon)$ …
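
A quick recap of the barrier class, since it drives the complexity bound: a $\theta$-logarithmically-homogeneous (self-concordant) barrier $f$ on a cone $K$ satisfies $f(tx) = f(x) - \theta \ln t$ for all $x \in \operatorname{int} K$ and $t > 0$. A standard example is $f(X) = -\ln\det X$ on the cone of $n \times n$ positive definite matrices, for which $\theta = n$ because $-\ln\det(tX) = -\ln\det X - n \ln t$.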

Distributionally Robust Optimization with Markovian Data

We study a stochastic program where the probability distribution of the uncertain problem parameters is unknown and only indirectly observed via finitely many correlated samples generated by an unknown Markov chain with d states. We propose a data-driven distributionally robust optimization model to estimate the problem’s objective function and optimal solution. By leveraging results from …
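
In generic terms (the notation below is illustrative; the paper's ambiguity set is constructed from the Markovian samples), a distributionally robust program hedges against the worst distribution in an ambiguity set $\mathcal{Q}$:

$\min_{x \in X} \; \sup_{Q \in \mathcal{Q}} \; \mathbb{E}_{\xi \sim Q}\big[\ell(x, \xi)\big],$

so that the resulting estimate remains valid for every distribution consistent with the observed data.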

New complexity results and algorithms for min-max-min robust combinatorial optimization

In this work we investigate the min-max-min robust optimization problem applied to combinatorial problems with uncertain cost vectors that are contained in a convex uncertainty set. The idea of the approach is to calculate a set of k feasible solutions that are worst-case optimal if, in each possible scenario, the best of the k solutions would …
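
Formally, the min-max-min robust model over a combinatorial feasible set $X \subseteq \{0,1\}^n$ and uncertainty set $U$ can be written (notation assumed here for illustration) as

$\min_{x^{(1)}, \dots, x^{(k)} \in X} \; \max_{c \in U} \; \min_{i \in \{1, \dots, k\}} c^\top x^{(i)},$

i.e., $k$ candidate solutions are fixed before the cost vector is revealed, and the best of the $k$ is used in each scenario.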

FrankWolfe.jl: a high-performance and flexible toolbox for Frank-Wolfe algorithms and Conditional Gradients

We present FrankWolfe.jl, an open-source implementation of several popular Frank-Wolfe and Conditional Gradients variants for first-order constrained optimization. The package is designed with flexibility and high performance in mind, allowing for easy extension and relying on few assumptions regarding the user-provided functions. It leverages Julia’s multiple dispatch and interfaces smoothly with generic linear optimization …

Fast cluster detection in networks by first-order optimization

Cluster detection plays a fundamental role in the analysis of data. In this paper, we focus on the use of s-defective clique models for network-based cluster detection and propose a nonlinear optimization approach that efficiently handles those models in practice. In particular, we introduce an equivalent continuous formulation for the problem under analysis, and we …

A unifying framework for the analysis of projection-free first-order methods under a sufficient slope condition

The analysis of projection-free first-order methods is often complicated by the presence of different kinds of “good” and “bad” steps. In this article, we propose a unifying framework for projection-free methods, aiming to simplify the convergence analysis by removing the distinction between such steps. The main tool employed in our framework is …

Generalized Stochastic Frank-Wolfe Algorithm with Stochastic “Substitute” Gradient for Structured Convex Optimization

The stochastic Frank-Wolfe method has recently attracted considerable interest in the context of optimization for statistical and machine learning due to its ability to work with a more general feasible region. However, there has been a complexity gap in the guaranteed convergence rate for stochastic Frank-Wolfe compared to its deterministic counterpart. In this work, …
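
To make the deterministic-versus-stochastic comparison concrete, here is a minimal sketch of a generic stochastic Frank-Wolfe step that replaces the full gradient with a running average of minibatch gradients; it is a simplified stand-in for, not a reproduction of, the paper's substitute-gradient construction, and the averaging weight is an illustrative choice.

```python
import numpy as np

def stochastic_frank_wolfe(grad_sample, lmo, x0, iterations=200, batch=8):
    """Generic stochastic Frank-Wolfe with an averaged gradient estimate.

    grad_sample(x) returns one stochastic gradient at x;
    lmo(g) returns a feasible point minimizing <g, .> over the feasible set.
    """
    x = x0.copy()
    g_bar = np.zeros_like(x0)              # running stand-in for the full gradient
    for t in range(iterations):
        g_new = np.mean([grad_sample(x) for _ in range(batch)], axis=0)
        rho = 2.0 / (t + 2.0)              # averaging weight (illustrative)
        g_bar = (1.0 - rho) * g_bar + rho * g_new
        v = lmo(g_bar)                     # linear minimization instead of projection
        gamma = 2.0 / (t + 2.0)
        x = (1.0 - gamma) * x + gamma * v
    return x

# Toy usage: stochastic least squares over the l1 ball.
rng = np.random.default_rng(1)
A, b = rng.standard_normal((200, 30)), rng.standard_normal(200)

def grad_sample(x):
    i = rng.integers(len(b))               # one randomly sampled row
    return 2.0 * A[i] * (A[i] @ x - b[i])

def lmo_l1(g):
    v = np.zeros_like(g)
    j = np.argmax(np.abs(g))
    v[j] = -np.sign(g[j])
    return v

x_hat = stochastic_frank_wolfe(grad_sample, lmo_l1, np.zeros(30))
```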

On a Frank-Wolfe Type Theorem in Cubic Optimization

A classical result due to Frank and Wolfe (1956) says that a quadratic function $f$ attains its supremum on a nonempty polyhedron $M$ if $f$ is bounded from above on $M$. In this note, we present a rigorous proof of the extension of this result to cubic optimization (known from Andronov, Belousov and Shironin (1982)). …
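
For intuition on why the degree matters: attainment can already fail for degree-four polynomials. A standard illustrative example (not taken from the note) is $f(x,y) = -x^2 - (1-xy)^2$ on $M = \mathbb{R}^2$: here $f$ is bounded from above by $0$ and $\sup_M f = 0$ (take $x = 1/n$, $y = n$), yet $f(x,y) < 0$ for every $(x,y)$, so the supremum is not attained.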