Preconditioning of a Generalized Forward-Backward Splitting and Application to Optimization on Graphs

We present a preconditioning of a generalized forward-backward splitting algorithm for finding a zero of a sum of maximally monotone operators \sum_{i=1}^n A_i + B, with B cocoercive, involving only the computation of B and of the resolvent of each A_i separately. In particular, this allows us to minimize functionals of the form \sum_{i=1}^n g_i + … Read more
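
As a rough illustration of the kind of splitting being preconditioned, here is a minimal numpy sketch of the plain (unpreconditioned, unit-relaxation) generalized forward-backward iteration for a toy problem of the form f + g_1 + g_2, where f is smooth and the g_i are handled through their proximity operators. The data, weights, and step size are illustrative assumptions, not taken from the paper.

    import numpy as np

    # Toy instance: minimize 0.5*||Ax - b||^2 + mu*||x||_1 + indicator(x >= 0).
    # The smooth term plays the role of B (via its gradient); the two nonsmooth
    # terms play the roles of the A_i (via their proximity operators).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 20))
    b = rng.standard_normal(40)
    mu = 0.5

    grad_f = lambda x: A.T @ (A @ x - b)
    proxes = [lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * mu, 0.0),  # prox of t*mu*||.||_1
              lambda v, t: np.maximum(v, 0.0)]                                # projection onto x >= 0
    w = np.array([0.5, 0.5])                       # weights summing to one
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of grad_f
    gamma = 1.0 / L                                # step size in (0, 2/L)

    z = [np.zeros(20) for _ in proxes]             # one auxiliary variable per nonsmooth term
    x = np.zeros(20)
    for _ in range(500):
        g = grad_f(x)
        for i, prox in enumerate(proxes):
            # generalized forward-backward update of the i-th auxiliary variable
            z[i] = z[i] + prox(2 * x - z[i] - gamma * g, gamma / w[i]) - x
        x = sum(wi * zi for wi, zi in zip(w, z))
    print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + mu * np.abs(x).sum())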

Linearly Convergent Away-Step Conditional Gradient for Non-strongly Convex Functions

We consider the problem of minimizing a function that is the sum of a linear function and a composition of a strongly convex function with a linear transformation, over a compact polyhedral set. Jaggi and Lacoste-Julien [14] showed that the conditional gradient method with away steps applied to the aforementioned problem without the additional linear … Read more
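
For readers who want to see the method being analyzed, the following is a small numpy sketch of the away-step conditional gradient (Frank-Wolfe) method on a toy quadratic over the unit simplex; the problem data, the exact line search, and the choice of the simplex as feasible set are assumptions made for the example, not the general polyhedral setting of the paper.

    import numpy as np

    # Toy instance: minimize f(x) = 0.5*||Ax - b||^2 over the unit simplex,
    # whose vertices are the standard basis vectors e_1, ..., e_n.
    rng = np.random.default_rng(1)
    n = 30
    A = rng.standard_normal((20, n))
    b = rng.standard_normal(20)
    f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
    grad = lambda x: A.T @ (A @ x - b)

    alpha = np.zeros(n); alpha[0] = 1.0            # convex-combination weights over vertices
    x = alpha.copy()
    for _ in range(300):
        g = grad(x)
        s = int(np.argmin(g))                      # Frank-Wolfe vertex
        active = np.where(alpha > 1e-12)[0]
        v = int(active[np.argmax(g[active])])      # away vertex (worst active vertex)
        d_fw = np.eye(n)[s] - x
        d_away = x - np.eye(n)[v]
        if -g @ d_fw >= -g @ d_away:               # pick the direction with larger descent
            d, gamma_max, away = d_fw, 1.0, False
        else:
            d, gamma_max, away = d_away, alpha[v] / (1.0 - alpha[v]), True
        Ad = A @ d                                  # exact line search for the quadratic
        gamma = 0.0 if Ad @ Ad == 0 else np.clip(-(g @ d) / (Ad @ Ad), 0.0, gamma_max)
        x = x + gamma * d
        if away:                                    # keep x as an explicit convex combination
            alpha *= (1 + gamma); alpha[v] -= gamma
        else:
            alpha *= (1 - gamma); alpha[s] += gamma
    print("f(x) =", f(x))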

Bridging the Gap Between Multigrid, Hierarchical, and Receding-Horizon Control

We analyze the structure of the Euler-Lagrange conditions of a lifted long-horizon optimal control problem. The analysis reveals that the conditions can be solved using block Gauss-Seidel schemes, and we prove that such schemes can be implemented by solving sequences of short-horizon problems. The analysis also reveals that a receding-horizon control scheme is equivalent … Read more
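
As a point of reference for the receding-horizon part of the statement, here is a minimal numpy sketch of a generic receding-horizon loop for an unconstrained linear-quadratic problem, where each short-horizon subproblem is solved in closed form and only the first control is applied. The dynamics, weights, and horizon length are made-up values; the paper's lifted formulation and its Gauss-Seidel interpretation are not reproduced here.

    import numpy as np

    # Receding-horizon (MPC-style) loop: solve a short-horizon LQ subproblem,
    # apply the first control, shift the horizon, and repeat.
    A = np.array([[1.0, 0.1], [0.0, 1.0]])      # double-integrator-like dynamics (assumed)
    B = np.array([[0.005], [0.1]])
    Q, R = np.eye(2), 0.1 * np.eye(1)
    H = 10                                       # short prediction horizon

    def solve_short_horizon(x0):
        """Return the optimal control sequence for the H-step subproblem."""
        nx, nu = A.shape[0], B.shape[1]
        # Stack the dynamics: X = Phi @ x0 + Gam @ U, with X = (x_1, ..., x_H)
        Phi = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(H)])
        Gam = np.zeros((H * nx, H * nu))
        for i in range(H):
            for j in range(i + 1):
                Gam[i*nx:(i+1)*nx, j*nu:(j+1)*nu] = np.linalg.matrix_power(A, i - j) @ B
        Qb = np.kron(np.eye(H), Q)
        Rb = np.kron(np.eye(H), R)
        # Closed-form solution of the unconstrained quadratic subproblem
        Hmat = Gam.T @ Qb @ Gam + Rb
        g = Gam.T @ Qb @ (Phi @ x0)
        return np.linalg.solve(Hmat, -g).reshape(H, nu)

    x = np.array([1.0, 0.0])
    for t in range(50):
        U = solve_short_horizon(x)
        x = A @ x + B @ U[0]                     # apply only the first control, then re-solve
    print("final state:", x)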

A Two-Level Approach to Large Mixed-Integer Programs with Application to Cogeneration in Energy-Efficient Buildings

We study a two-stage mixed-integer linear program (MILP) with more than 1 million binary variables in the second stage. We develop a two-level approach by constructing a semi-coarse model (coarsened with respect to variables) and a coarse model (coarsened with respect to both variables and constraints). We coarsen binary variables by selecting a small number … Read more

Extended Formulations for Independence Polytopes of Regular Matroids

We show that the independence polytope of every regular matroid has an extended formulation of size quadratic in the size of its ground set. This generalizes a similar statement for (co-)graphic matroids, which is a simple consequence of Martin’s extended formulation for the spanning-tree polytope. In our construction, we make use of Seymour’s decomposition theorem … Read more

Distributed Gradient Methods with Variable Number of Working Nodes

We consider distributed optimization where $N$ nodes in a connected network minimize the sum of their local costs subject to a common constraint set. We propose a distributed projected gradient method where each node, at each iteration $k$, performs an update (is active) with probability $p_k$, and stays idle (is inactive) with probability $1-p_k$. Whenever … Read more
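
A rough sketch of the flavor of scheme being studied is given below: nodes on a ring graph minimize a sum of quadratic local costs over a box, each node being active with an (assumed, increasing) probability p_k and combining a consensus averaging step with a projected gradient step when active. The topology, local costs, mixing weights, and activation schedule are illustrative assumptions rather than the paper's exact method.

    import numpy as np

    # Toy sketch: N nodes on a ring minimize sum_i 0.5*||x - a_i||^2 over the box [-1, 1]^d.
    rng = np.random.default_rng(2)
    N, d = 10, 3
    a = rng.standard_normal((N, d))              # local data defining f_i(x) = 0.5*||x - a_i||^2

    # Metropolis-style mixing weights for a ring graph (doubly stochastic)
    W = np.zeros((N, N))
    for i in range(N):
        for j in ((i - 1) % N, (i + 1) % N):
            W[i, j] = 1.0 / 3.0
        W[i, i] = 1.0 - W[i].sum()

    x = np.zeros((N, d))                          # one iterate per node
    proj = lambda v: np.clip(v, -1.0, 1.0)        # projection onto the common constraint set

    for k in range(1, 501):
        p_k = min(1.0, 0.3 + k / 500.0)           # activation probability, increasing with k
        active = rng.random(N) < p_k
        step = 1.0 / k
        x_new = x.copy()
        for i in np.where(active)[0]:
            consensus = W[i] @ x                  # weighted average of neighbors' iterates
            grad_i = x[i] - a[i]                  # gradient of the local cost at x_i
            x_new[i] = proj(consensus - step * grad_i)
        x = x_new

    print("max disagreement:", np.abs(x - x.mean(axis=0)).max())
    print("consensus point:", x.mean(axis=0), "  true minimizer:", proj(a.mean(axis=0)))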

Stochastic Optimization using a Trust-Region Method and Random Models

In this paper, we propose and analyze a trust-region model-based algorithm for solving unconstrained stochastic optimization problems. Our framework utilizes random models of an objective function $f(x)$, obtained from stochastic observations of the function or its gradient. Our method also utilizes estimates of function values to gauge progress that is being made. The convergence analysis … Read more
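
To fix ideas, the following is a very small sketch of a trust-region loop driven by noisy (random) models and function estimates: a linear model is built from a noisy gradient, a Cauchy-type step is taken, and the step is accepted or rejected based on noisy function values. The test function, noise level, and constants are made up, and the toy does not enforce the probabilistic accuracy conditions required by the paper's convergence analysis.

    import numpy as np

    rng = np.random.default_rng(3)
    f_true = lambda x: np.sum((x - 1.0) ** 2) + np.sum(x[:-1] * x[1:])
    g_true = lambda x: 2.0 * (x - 1.0) + np.concatenate(([x[1]], x[2:] + x[:-2], [x[-2]]))

    sigma = 0.01
    f_est = lambda x: f_true(x) + sigma * rng.standard_normal()        # noisy function estimate
    g_est = lambda x: g_true(x) + sigma * rng.standard_normal(x.shape) # noisy gradient

    x = np.zeros(5)
    delta = 1.0                                   # trust-region radius
    eta, gamma = 0.1, 2.0                         # acceptance threshold, radius update factor

    for _ in range(200):
        g = g_est(x)                              # random (linear) model of f around x
        if np.linalg.norm(g) == 0:
            break
        s = -delta * g / np.linalg.norm(g)        # Cauchy-type step to the trust-region boundary
        pred = -g @ s                             # decrease predicted by the linear model
        ared = f_est(x) - f_est(x + s)            # decrease estimated from noisy evaluations
        if ared >= eta * pred:                    # successful step: accept and enlarge the region
            x, delta = x + s, gamma * delta
        else:                                     # unsuccessful: keep x and shrink the region
            delta /= gamma
    print("final point:", x, " f(x) =", f_true(x))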

An MILP-MINLP decomposition method for the global optimization of a source-based model of the multiperiod blending problem

The multiperiod blending problem involves binary variables and bilinear terms, yielding a nonconvex MINLP. In this work we present two major contributions for the global solution of the problem. The first one is an alternative formulation of the problem. This formulation makes use of redundant constraints that improve the MILP relaxation of the MINLP. The … Read more
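
As background on the bilinear part, the sketch below writes out the standard McCormick envelope for a single bilinear term w = x*y over a box, which is the usual building block for the (MI)LP relaxations referred to here; it is the textbook relaxation, not necessarily the redundant-constraint formulation proposed in the paper.

    import numpy as np

    # McCormick envelope for w = x*y with xL <= x <= xU and yL <= y <= yU.
    def mccormick_cuts(xL, xU, yL, yU):
        """Return functions h(x, y, w) >= 0 encoding the four McCormick inequalities."""
        return [
            lambda x, y, w: w - (xL * y + yL * x - xL * yL),   # under-estimators: w >= ...
            lambda x, y, w: w - (xU * y + yU * x - xU * yU),
            lambda x, y, w: (xU * y + yL * x - xU * yL) - w,   # over-estimators: w <= ...
            lambda x, y, w: (xL * y + yU * x - xL * yU) - w,
        ]

    # Quick sanity check: the true product w = x*y satisfies every inequality on the box.
    cuts = mccormick_cuts(0.0, 2.0, -1.0, 3.0)
    rng = np.random.default_rng(4)
    for _ in range(1000):
        x = rng.uniform(0.0, 2.0); y = rng.uniform(-1.0, 3.0)
        assert all(c(x, y, x * y) >= -1e-12 for c in cuts)
    print("McCormick envelope contains the bilinear surface on the box.")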

Lower Bounds on Complexity of Lyapunov Functions for Switched Linear Systems

We show that for any positive integer $d$, there are families of switched linear systems—in fixed dimension and defined by two matrices only—that are stable under arbitrary switching but do not admit (i) a polynomial Lyapunov function of degree $\leq d$, or (ii) a polytopic Lyapunov function with $\leq d$ facets, or (iii) a piecewise … Read more
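
For context, the object whose non-existence is being quantified can be searched for numerically: the sketch below uses cvxpy (assumed available, together with an SDP solver) to look for a common quadratic Lyapunov function x'Px of a discrete-time switched pair x+ = A_i x by solving the standard LMI feasibility problem. The two matrices are arbitrary stable examples, not the families constructed in the paper.

    import numpy as np
    import cvxpy as cp

    # Search for P > 0 with A_i' P A_i - P < 0 for both modes (common quadratic
    # Lyapunov function under arbitrary switching).
    A1 = np.array([[0.6, 0.4], [0.0, 0.5]])
    A2 = np.array([[0.5, 0.0], [0.4, 0.6]])

    P = cp.Variable((2, 2), symmetric=True)
    eps = 1e-3
    constraints = [P >> eps * np.eye(2)]
    for A in (A1, A2):
        constraints.append(A.T @ P @ A - P << -eps * np.eye(2))
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve()
    print("LMI status:", prob.status)
    if prob.status == "optimal":
        print("common quadratic Lyapunov function found, P =\n", P.value)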

New results on subgradient methods for strongly convex optimization problems with a unified analysis

We develop subgradient- and gradient-based methods for minimizing strongly convex functions under a notion of strong convexity that generalizes the standard Euclidean one. We propose a unifying framework for subgradient methods that yields two kinds of methods, namely the Proximal Gradient Method (PGM) and the Conditional Gradient Method (CGM), and subsumes several existing methods. The unifying framework provides … Read more
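
By way of contrast with the generalized setting, here is a minimal numpy sketch of the classical projected subgradient method for a strongly convex nonsmooth objective, using the step size 2/(mu*(k+1)) and weighted iterate averaging; this is the standard Euclidean special case only, and the problem data and feasible set are assumptions.

    import numpy as np

    rng = np.random.default_rng(5)
    a = rng.standard_normal(10)
    mu = 1.0
    f = lambda x: 0.5 * mu * np.sum(x ** 2) + np.abs(x - a).sum()   # mu-strongly convex, nonsmooth
    subgrad = lambda x: mu * x + np.sign(x - a)                     # one valid subgradient
    proj = lambda v: np.clip(v, -0.5, 0.5)                          # feasible set: a box

    x = np.zeros(10)
    K = 2000
    x_avg = np.zeros(10)
    for k in range(1, K + 1):
        step = 2.0 / (mu * (k + 1))                 # classical strongly convex step size
        x = proj(x - step * subgrad(x))
        x_avg += 2.0 * k / (K * (K + 1)) * x        # weighted averaging with weight ~ k

    print("f(last iterate) =", f(x), "  f(averaged iterate) =", f(x_avg))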