Derivative-free optimization methods

In many optimization problems arising from scientific, engineering and artificial intelligence applications, objective and constraint functions are available only as the output of a black-box or simulation oracle that does not provide derivative information. Such settings necessitate the use of methods for derivative-free, or zeroth-order, optimization. We provide a review and perspectives on developments in … Read more
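
As a rough illustration of the zeroth-order setting (a minimal sketch of one simple direct-search scheme, not of the methods surveyed in the review; the function name compass_search and its parameters are ours), the following Python snippet minimizes a black-box objective using only function evaluations:

    import numpy as np

    def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
        # Direct search: query only f(x); no gradients are ever formed.
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        for _ in range(max_iter):
            improved = False
            for i in range(x.size):
                for sign in (1.0, -1.0):
                    trial = x.copy()
                    trial[i] += sign * step       # poll along coordinate i
                    ft = f(trial)
                    if ft < fx:                   # accept the first improving point
                        x, fx, improved = trial, ft, True
                        break
                if improved:
                    break
            if not improved:
                step *= 0.5                       # no improvement: shrink the poll radius
                if step < tol:
                    break
        return x, fx

    # Example: a smooth function treated as a black box
    x_best, f_best = compass_search(lambda x: (x[0] - 1.0)**2 + (x[1] + 2.0)**2, [0.0, 0.0])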

The Quadratic Cycle Cover Problem: special cases and efficient bounds

The quadratic cycle cover problem is the problem of finding a set of node-disjoint cycles visiting all the nodes such that the total sum of interaction costs between incident arcs is minimized. In this paper we study the linearization problem for the quadratic cycle cover problem and related lower bounds. In particular, we derive various … Read more
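
To fix ideas, here is a minimal sketch of the kind of quadratic formulation involved, assuming a digraph G = (V, A), binary arc variables x_a, and interaction costs q_{ab} (typically nonzero only for pairs of incident arcs); the notation is ours, not necessarily the paper's:

\[
\min_{x \in \{0,1\}^{|A|}} \; \sum_{a \in A} \sum_{b \in A} q_{ab}\, x_a x_b
\quad \text{s.t.} \quad \sum_{a \in \delta^{+}(v)} x_a = 1, \;\; \sum_{a \in \delta^{-}(v)} x_a = 1 \quad \forall v \in V,
\]

where the degree constraints force every node onto exactly one cycle. The linearization problem asks when such a quadratic objective can be replaced by an equivalent linear one.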

Discrete Optimization Methods for Group Model Selection in Compressed Sensing

In this article we study the problem of signal recovery for group models. More precisely, for a given set of groups, each containing a small subset of indices, and for given linear sketches of the true signal vector, which is known to be group-sparse in the sense that its support is contained in the union … Read more
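
As a hedged prototype of the recovery task described (our notation, not necessarily the article's exact model): given groups G_1, ..., G_M ⊆ {1, ..., n}, a sketching matrix A, and measurements y = A x*, group-sparse recovery with at most k active groups can be posed as

\[
\min_{x \in \mathbb{R}^{n}} \; \| y - A x \|_2^2
\quad \text{s.t.} \quad \operatorname{supp}(x) \subseteq \bigcup_{i \in S} G_i \ \text{ for some } S \subseteq \{1, \dots, M\}, \ |S| \le k .
\]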

Logarithmic-Barrier Decomposition Interior-Point Methods for Stochastic Linear Optimization in a Hilbert Space

Several logarithmic-barrier interior-point methods are now available for solving two-stage stochastic optimization problems with recourse in the finite-dimensional setting. However, despite the genuine need for studying such methods in general spaces, there are no infinite-dimensional analogs of these methods. Inspired by this evident gap in the literature, in this paper, we propose logarithmic-barrier decomposition-based interior-point … Read more
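
For orientation, the finite-dimensional prototype that such methods extend is sketched below (our notation; the paper's Hilbert-space formulation differs): the two-stage stochastic linear program with recourse, with logarithmic barriers on the nonnegativity constraints of both stages, reads

\[
\min_{x > 0} \; c^{\top} x - \mu \sum_{j} \ln x_j + \sum_{k=1}^{K} p_k\, Q_k^{\mu}(x),
\qquad
Q_k^{\mu}(x) = \min_{y > 0} \Big\{ q_k^{\top} y - \mu \sum_{i} \ln y_i \;:\; W y = h_k - T_k x \Big\},
\]

with the barrier parameter \mu driven to zero; decomposition exploits the separability of the recourse terms Q_k^{\mu}.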

Identifying Effective Scenarios for Sample Average Approximation

We introduce a method to improve the tractability of the well-known Sample Average Approximation (SAA) without compromising important theoretical properties, such as convergence in probability and the consistency of an independent and identically distributed (iid) sample. We consider each scenario as a polyhedron in the combined space of first-stage and second-stage decision variables. According to John’s … Read more
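
For reference, the SAA construction in question replaces the expectation objective by an average over an iid sample \xi^1, ..., \xi^N (standard form, in our notation):

\[
\min_{x \in X} \; \mathbb{E}\big[ F(x, \xi) \big]
\;\;\approx\;\;
\min_{x \in X} \; \hat{f}_N(x) := \frac{1}{N} \sum_{i=1}^{N} F(x, \xi^{i}),
\]

and identifying effective scenarios amounts, roughly, to isolating the sample points \xi^{i} that actually influence the SAA solution.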

Tractable semi-algebraic approximation using Christoffel-Darboux kernel

We provide a new method to approximate a (possibly discontinuous) function using Christoffel-Darboux kernels. Our knowledge of the unknown multivariate function is given in terms of finitely many moments of the Young measure supported on the graph of the function. Such an input is available when approximating weak (or measure-valued) solutions of optimal control problems, entropy … Read more
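
For context, the Christoffel-Darboux kernel of degree d associated with a measure \mu and an orthonormal polynomial family (p_\alpha)_{|\alpha| \le d} in L^2(\mu) is the standard object

\[
K_d^{\mu}(x, y) \;=\; \sum_{|\alpha| \le d} p_{\alpha}(x)\, p_{\alpha}(y),
\qquad
\Lambda_d^{\mu}(x) \;=\; \frac{1}{K_d^{\mu}(x, x)},
\]

where \Lambda_d^{\mu} is the associated Christoffel function; how these objects are combined with the moment data of the Young measure is the subject of the paper.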

Error estimates for iterative algorithms for minimizing regularized quadratic subproblems

We derive bounds for the objective errors and gradient residuals when finding approximations to the solution of common regularized quadratic optimization problems within evolving Krylov spaces. These provide upper bounds on the number of iterations required to achieve a given accuracy. We illustrate the quality of our bounds on test examples. Citation: Technical Report … Read more
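
One common instance of the subproblems in question (a sketch assuming the cubic-regularization form; the report covers a broader family) is

\[
\min_{s \in \mathbb{R}^{n}} \; q(s) := g^{\top} s + \tfrac{1}{2}\, s^{\top} H s + \tfrac{\sigma}{3}\, \| s \|^{3},
\]

approximately minimized over the growing Krylov spaces \mathcal{K}_j = \operatorname{span}\{ g, Hg, \dots, H^{j-1} g \}; the bounds quantify how quickly the objective error and gradient residual decay as j grows.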

Planning Out-of-Hours Services for Pharmacies

The supply of pharmaceuticals is an important factor in a functioning health care system. In the German health care system, the chambers of pharmacists are legally obliged to ensure that every resident can find an open pharmacy within an appropriate distance at any time of day or night. To that end, the chambers of pharmacists create … Read more

A Delayed Weighted Gradient Method for Strictly Convex Quadratic Minimization

This paper develops an accelerated version of the steepest descent method based on a two-step iteration. The new algorithm uses delayed information to define its iterations. Specifically, in the first step, a prediction of the new test point is calculated using the gradient method with the exact minimal gradient steplength, and then a correction … Read more
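
To illustrate only the prediction step, here is a minimal Python sketch for f(x) = (1/2) x^T A x - b^T x with A symmetric positive definite (the correction step is the paper's contribution and is not reproduced; function and variable names are ours):

    import numpy as np

    def minimal_gradient_step(A, b, x):
        # Steepest descent with the exact steplength that minimizes the norm
        # of the *next* gradient of f(x) = 0.5 x'Ax - b'x (minimal gradient step).
        g = A @ x - b                 # current gradient
        Ag = A @ g
        t = (g @ Ag) / (Ag @ Ag)      # argmin_t || g - t * A g ||
        return x - t * g

    # Toy usage on a small SPD system (illustrative data)
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    x = np.zeros(2)
    for _ in range(50):
        x = minimal_gradient_step(A, b, x)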

Partially observable multistage stochastic programming

We propose a class of partially observable multistage stochastic programs and describe an algorithm for solving this class of problems. We provide a Bayesian update of a belief-state vector, extend the stochastic programming formulation to incorporate the belief state, and characterize saddle-function properties of the corresponding cost-to-go function. Our algorithm is a derivative of the … Read more
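
A generic belief-state update of the kind referred to (a sketch assuming a finite hidden-state Markov chain with transition probabilities Pr(s | s') and observation likelihoods Pr(o | s); the paper's precise update may differ) is

\[
b'(s) \;=\; \frac{\Pr(o \mid s) \sum_{s'} \Pr(s \mid s')\, b(s')}{\sum_{\tilde{s}} \Pr(o \mid \tilde{s}) \sum_{s'} \Pr(\tilde{s} \mid s')\, b(s')},
\]

so that the belief vector b can be carried forward as an additional state in the multistage formulation.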