An adaptive robust optimization model for parallel machine scheduling

Real-life parallel machine scheduling problems can be characterized by: (i) limited information about the exact task durations at scheduling time, and (ii) the opportunity to reschedule the remaining tasks each time a task is completed and a machine becomes idle. Robust optimization is the natural methodology to cope with the first characteristic of duration …
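To make the second characteristic concrete, here is a minimal simulation sketch of the rolling setting the abstract describes: dispatch decisions are taken whenever a machine becomes idle, while realized durations deviate from nominal ones. The LPT dispatch rule, the deviation model, and all names below are illustrative assumptions, not the paper's adaptive robust model.

```python
import heapq
import random

def rolling_dispatch(nominal, machines=2, seed=0, dev=0.3):
    """Dispatch jobs on parallel machines: each time a machine becomes idle,
    pick the remaining job with the longest nominal duration (LPT rule) and
    run it with a realized duration that deviates from the nominal one."""
    rng = random.Random(seed)
    remaining = sorted(range(len(nominal)), key=lambda j: -nominal[j])
    idle = [(0.0, m) for m in range(machines)]   # (time machine becomes idle, id)
    heapq.heapify(idle)
    makespan = 0.0
    for job in remaining:
        t, m = heapq.heappop(idle)               # next machine to become idle
        realized = nominal[job] * (1 + rng.uniform(-dev, dev))
        finish = t + realized
        makespan = max(makespan, finish)
        heapq.heappush(idle, (finish, m))
    return makespan

print(rolling_dispatch([4.0, 3.0, 3.0, 2.0, 2.0, 1.0]))
```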

First-order algorithms for robust optimization problems via convex-concave saddle-point Lagrangian reformulation

Robust optimization (RO) is one of the key paradigms for solving optimization problems affected by uncertainty. Two principal approaches for RO, the robust counterpart method and the adversarial approach, potentially lead to excessively large optimization problems. For that reason, first-order approaches, based on online convex optimization, have been proposed (Ben-Tal et al. (2015), Kilinc-Karzan and Ho-Nguyen …
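As a rough illustration of the convex-concave saddle-point viewpoint (not the paper's algorithm), the sketch below runs projected gradient descent-ascent with averaging on a toy bilinear problem min over the simplex, max over a unit ball of (c + Pu)ᵀx; the instance, feasible sets, and helper names are assumptions made for the example.

```python
import numpy as np

def proj_simplex(v):
    """Euclidean projection onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / (np.arange(len(v)) + 1) > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

def proj_ball(u, radius=1.0):
    n = np.linalg.norm(u)
    return u if n <= radius else u * (radius / n)

def saddle_gda(c, P, steps=2000, eta=0.05):
    """Projected gradient descent-ascent with averaging for the
    convex-concave problem min_{x in simplex} max_{||u|| <= 1} (c + P u)^T x."""
    n, m = P.shape
    x, u = np.full(n, 1.0 / n), np.zeros(m)
    x_avg, u_avg = np.zeros(n), np.zeros(m)
    for _ in range(steps):
        gx = c + P @ u          # gradient in x (descent step)
        gu = P.T @ x            # gradient in u (ascent step)
        x = proj_simplex(x - eta * gx)
        u = proj_ball(u + eta * gu)
        x_avg += x
        u_avg += u
    return x_avg / steps, u_avg / steps

rng = np.random.default_rng(0)
c, P = rng.normal(size=5), rng.normal(size=(5, 3))
x_bar, _ = saddle_gda(c, P)
# worst case over the ball has the closed form c^T x + ||P^T x||_2
print("worst-case objective:", c @ x_bar + np.linalg.norm(P.T @ x_bar))
```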

Generalized Self-Concordant Analysis of Frank-Wolfe Algorithms

Projection-free optimization via different variants of the Frank-Wolfe (FW) method has become one of the cornerstones in large-scale optimization for machine learning and computational statistics. Numerous applications within these fields involve the minimization of functions with self-concordance-like properties. Such generalized self-concordant (GSC) functions do not necessarily feature a Lipschitz continuous gradient, nor are …
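For intuition, here is a minimal numpy sketch of plain Frank-Wolfe on one GSC-type objective, a log-barrier-style function over the simplex whose gradient blows up near the boundary; the adaptive step-size rules studied in this line of work are replaced by simple backtracking, and the instance is invented for illustration.

```python
import numpy as np

def frank_wolfe_gsc(A, iters=200):
    """Vanilla Frank-Wolfe for f(x) = -sum_i log(a_i^T x) over the probability
    simplex. With strictly positive A the objective is well defined on the
    simplex but admits no global Lipschitz constant for its gradient."""
    m, n = A.shape
    x = np.full(n, 1.0 / n)                 # interior starting point
    for _ in range(iters):
        grad = -A.T @ (1.0 / (A @ x))       # gradient of f at x
        j = np.argmin(grad)                 # LMO over the simplex: a vertex e_j
        d = -x.copy()
        d[j] += 1.0                         # FW direction e_j - x
        # backtracking instead of the classical 2/(k+2) rule, since no global
        # smoothness constant is available for this objective
        gamma, f_x = 1.0, -np.sum(np.log(A @ x))
        while gamma > 1e-12:
            y = A @ (x + gamma * d)
            if np.all(y > 0) and -np.sum(np.log(y)) < f_x + 0.5 * gamma * (grad @ d):
                break
            gamma *= 0.5
        x = x + gamma * d
    return x

rng = np.random.default_rng(1)
A = rng.uniform(0.5, 2.0, size=(30, 10))    # strictly positive data
x_star = frank_wolfe_gsc(A)
print(x_star.round(3), -np.sum(np.log(A @ x_star)))
```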

A Data-Driven Approach to Multi-Stage Stochastic Linear Optimization

We propose a new data-driven approach for addressing multi-stage stochastic linear optimization problems with unknown distributions. The approach consists of solving a robust optimization problem that is constructed from sample paths of the underlying stochastic process. We provide asymptotic bounds on the gap between the optimal costs of the robust optimization problem and the underlying …
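To give a flavor of building a robust problem from sample data, the sketch below solves a hypothetical single-period newsvendor instance in which each observed demand is inflated to a small interval and the worst case over that interval is averaged across observations; the paper's multi-stage formulation and its decision rules are substantially richer, and every name and parameter here is an assumption for the example.

```python
import numpy as np

def sample_robust_newsvendor(demand_samples, eps, c=1.0, p=3.0, h=0.5):
    """Choose an order quantity x before demand is known, hedging against a
    ball (here an interval) of radius eps around each observed demand."""

    def second_stage(x, d):
        # backlog penalty plus holding cost after demand d is revealed
        return p * max(d - x, 0.0) + h * max(x - d, 0.0)

    def robust_cost(x):
        worst = []
        for d in demand_samples:
            # second-stage cost is convex in d, so the worst case over
            # [d - eps, d + eps] is attained at one of the two endpoints
            worst.append(max(second_stage(x, d - eps), second_stage(x, d + eps)))
        return c * x + float(np.mean(worst))

    grid = np.linspace(0.0, max(demand_samples) + eps, 500)
    return min(grid, key=robust_cost)

samples = [8.0, 10.0, 11.0, 14.0]
print(sample_robust_newsvendor(samples, eps=1.0))
```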

Linearly Convergent Away-Step Conditional Gradient for Non-strongly Convex Functions

We consider the problem of minimizing a function that is the sum of a linear function and a composition of a strongly convex function with a linear transformation, over a compact polyhedral set. Jaggi and Lacoste-Julien [14] showed that the conditional gradient method with away steps employed on the aforementioned problem without the additional linear …
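The sketch below runs a generic away-step conditional gradient scheme on one instance of this problem class, a random linear term plus a least-squares composition over the unit simplex; the instance, the exact line search, and the helper names are assumptions for illustration, not the specific method analyzed in the paper.

```python
import numpy as np

def away_step_cg(c, A, b, iters=500):
    """Away-step conditional gradient for f(x) = c^T x + ||A x - b||^2 over the
    unit simplex, storing the iterate as an explicit convex combination of
    simplex vertices (the active set is encoded in the weight vector alpha)."""
    n = len(c)
    alpha = np.zeros(n)
    alpha[0] = 1.0                            # start at the vertex e_0
    x = alpha.copy()
    for _ in range(iters):
        grad = c + 2.0 * A.T @ (A @ x - b)
        s = np.argmin(grad)                   # Frank-Wolfe vertex
        active = np.nonzero(alpha > 1e-12)[0]
        v = active[np.argmax(grad[active])]   # away vertex: worst active vertex
        d_fw = -x.copy(); d_fw[s] += 1.0      # e_s - x
        d_aw = x.copy();  d_aw[v] -= 1.0      # x - e_v
        if grad @ d_fw <= grad @ d_aw:        # pick the steeper descent direction
            d, gamma_max, fw_step = d_fw, 1.0, True
        else:
            d, gamma_max, fw_step = d_aw, alpha[v] / (1.0 - alpha[v] + 1e-16), False
        # exact line search for the quadratic objective, clipped to [0, gamma_max]
        Ad = A @ d
        denom = 2.0 * (Ad @ Ad)
        gamma = gamma_max if denom == 0 else min(gamma_max, max(0.0, -(grad @ d) / denom))
        x = x + gamma * d
        if fw_step:
            alpha *= (1.0 - gamma); alpha[s] += gamma
        else:
            alpha *= (1.0 + gamma); alpha[v] -= gamma
    return x

rng = np.random.default_rng(2)
n, m = 20, 10
c, A, b = rng.normal(size=n), rng.normal(size=(m, n)), rng.normal(size=m)
x_hat = away_step_cg(c, A, b)
print(x_hat.sum().round(6), c @ x_hat + np.sum((A @ x_hat - b) ** 2))
```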