Spurious Local Minima Exist for Almost All Over-parameterized Neural Networks

A popular belief for explaining the efficiency of training deep neural networks is that over-parameterized neural networks have a nice loss landscape. However, it remains unclear whether over-parameterized neural networks contain spurious local minima in general, since all current positive results cannot prove the non-existence of bad local minima, and all current negative results have strong restrictions … Read more

Nonlinear Optimization of District Heating Networks

We develop a complementarity-constrained nonlinear optimization model for the time-dependent control of district heating networks. The main physical aspects of water and heat flow in these networks are governed by nonlinear, hyperbolic, one-dimensional partial differential equations. In addition, a pooling-type mixing model is required at the nodes of the network to treat the mixing … Read more
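
As a rough illustration (not the model from the paper), heat transport in a single pipe is often described by an advection equation for the water temperature $T(x,t)$ with flow velocity $v$ and a loss term towards the ambient temperature $T_a$, while a pooling-type mixing node sets the outflow temperature to the flow-weighted average of the incoming temperatures; the symbols $v$, $k$, $q_i$ and $T_i$ here are generic placeholders:
\[
\partial_t T + v\,\partial_x T = -k\,(T - T_a),
\qquad
T_{\mathrm{out}} = \frac{\sum_i q_i T_i}{\sum_i q_i}.
\]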

Solving Large Scale Cubic Regularization by a Generalized Eigenvalue Problem

Cubic regularization methods have several favorable properties. In particular, under mild assumptions, they are globally convergent to critical points at which second-order necessary conditions are satisfied. Their adoption among practitioners, however, does not yet match the strong theoretical results. One of the reasons for this discrepancy may be the additional implementation complexity needed to solve the occurring … Read more
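
For context, the subproblem that has to be solved at every iteration of a cubic regularization method is (with gradient $g$, Hessian $H$ and regularization weight $\sigma > 0$)
\[
\min_{s \in \mathbb{R}^n} \; g^\top s + \tfrac{1}{2}\, s^\top H s + \tfrac{\sigma}{3}\,\|s\|^3,
\]
whose global minimizer is characterized by $(H + \sigma\|s\|I)\,s = -g$ with $H + \sigma\|s\|I \succeq 0$. The paper's contribution is to reduce this computation to a single generalized eigenvalue problem; the exact form of that eigenvalue problem is given in the paper.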

Branch-and-cut-and-price for the Cardinality-constrained Multi-cycle Problem in Kidney Exchange

The establishment of kidney exchange programs has dramatically improved rates of kidney transplants by matching donors to compatible patients who would otherwise fail to receive a kidney for transplant. Rather than simply swapping kidneys between two patient-donor pairs, having multiple patient-donor pairs simultaneously donate kidneys in a cyclic manner enables all participants to receive a … Read more
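
A standard cycle formulation of this problem (shown only as background, with $\mathcal{C}_K$ the set of exchange cycles of length at most $K$, $w_c$ the weight of cycle $c$, and $V$ the patient-donor pairs) is
\[
\max \sum_{c \in \mathcal{C}_K} w_c\, x_c
\quad \text{s.t.} \quad
\sum_{c \,\ni\, v} x_c \le 1 \;\; \forall v \in V,
\qquad x_c \in \{0,1\},
\]
and branch-and-cut-and-price handles the exponential number of cycle variables $x_c$ by generating promising cycles on the fly during the search.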

Improving sample average approximation using distributional robustness

We consider stochastic optimization problems in which we aim to minimize the expected value of an objective function with respect to an unknown distribution of random parameters. We analyse the out-of-sample performance of solutions obtained by solving a distributionally robust version of the sample average approximation problem for unconstrained quadratic problems, and derive conditions under … Read more
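
Schematically, the sample average approximation and its distributionally robust counterpart can be written as (with $\hat{P}_N$ the empirical distribution of the samples $\xi_1, \dots, \xi_N$ and $\mathcal{B}_\varepsilon(\hat{P}_N)$ an ambiguity set of radius $\varepsilon$; the paper's precise ambiguity set may differ)
\[
\min_x \; \frac{1}{N} \sum_{i=1}^N f(x, \xi_i)
\qquad \text{vs.} \qquad
\min_x \; \max_{Q \in \mathcal{B}_\varepsilon(\hat{P}_N)} \mathbb{E}_Q\big[f(x, \xi)\big].
\]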

Joint chance-constrained programs and the intersection of mixing sets through a submodularity lens

A particularly important substructure in modeling joint linear chance-constrained programs with random right-hand sides and finite sample space is the intersection of mixing sets with common binary variables (and possibly a knapsack constraint). In this paper, we first revisit basic mixing sets by establishing a strong and previously unrecognized connection to submodularity. In particular, we … Read more
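
For reference, a basic mixing set with right-hand sides $h_1 \ge \dots \ge h_n \ge 0$ is
\[
\big\{ (y, z) \in \mathbb{R}_+ \times \{0,1\}^n : \; y + h_i z_i \ge h_i, \; i = 1, \dots, n \big\},
\]
and the classical mixing inequalities take the form
\[
y + \sum_{j=1}^{k} (h_{t_j} - h_{t_{j+1}})\, z_{t_j} \ge h_{t_1}
\]
for subsequences with $h_{t_1} \ge \dots \ge h_{t_k}$ and $h_{t_{k+1}} := 0$; this is standard background and not the submodular derivation developed in the paper.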

Optimal Crashing of an Activity Network with Disruptions

In this paper, we consider an optimization problem involving crashing an activity network under a single disruption. A disruption is an event whose magnitude and timing are random. When a disruption occurs, the duration of an activity that has not yet started can change. We formulate a two-stage stochastic mixed-integer program, in which the … Read more
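
In skeleton form (purely illustrative; $x$ stands for first-stage crashing decisions, $\xi$ for the random disruption, and $Q$ for the second-stage recourse value), such a two-stage model reads
\[
\min_{x \in X} \; c^\top x + \mathbb{E}_\xi\big[ Q(x, \xi) \big],
\qquad
Q(x, \xi) = \min_{y \in Y(x, \xi)} q(\xi)^\top y,
\]
where the second stage re-optimizes the crashing of the not-yet-started activities after the disruption is observed.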

Stochastic mesh adaptive direct search for blackbox optimization using probabilistic estimates

We present a stochastic extension of the mesh adaptive direct search (MADS) algorithm originally developed for deterministic blackbox optimization. The algorithm, called StoMADS, considers the unconstrained optimization of an objective function f whose values can be computed only through a blackbox corrupted by some random noise following an unknown distribution. The proposed method is based … Read more
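
As a toy sketch of direct search under noise (this is not the StoMADS algorithm itself; noisy_f, estimate, gamma and the sampling budget are made-up placeholders), one can average repeated blackbox evaluations to form estimates and accept a poll point only when the estimated decrease is sufficiently large relative to the step size:

import numpy as np

def noisy_f(x, rng):
    # Hypothetical noisy blackbox: a smooth test objective plus Gaussian noise.
    return float(np.sum(x**2)) + 0.01 * rng.standard_normal()

def estimate(x, rng, samples=16):
    # Function-value estimate built by averaging repeated noisy evaluations.
    return float(np.mean([noisy_f(x, rng) for _ in range(samples)]))

def noisy_direct_search(x0, iters=100, delta=1.0, gamma=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        f_x = estimate(x, rng)
        success = False
        # Poll along a simple positive basis (plus/minus coordinate directions).
        for d in np.vstack([np.eye(n), -np.eye(n)]):
            trial = x + delta * d
            # Accept only a sufficient *estimated* decrease, scaled by delta^2,
            # so small spurious improvements caused by noise are rejected.
            if estimate(trial, rng) <= f_x - gamma * delta**2:
                x, success = trial, True
                break
        delta = 2.0 * delta if success else 0.5 * delta
    return x

print(noisy_direct_search([2.0, -1.5]))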

The Outcome Range Problem in Interval Linear Programming

Quantifying extra functions, herein referred to as outcome functions, over optimal solutions of an optimization problem can provide decision makers with additional information on a system. This bears more importance when the optimization problem is subject to uncertainty in input parameters. In this paper, we consider linear programming problems in which input parameters are described … Read more
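
Concretely (an illustrative reading, not the paper's exact definition), for a linear outcome function $g(x) = d^\top x$ the outcome range problem asks for the interval
\[
\Big[ \min_{x \in \mathcal{S}} d^\top x, \;\; \max_{x \in \mathcal{S}} d^\top x \Big],
\]
where $\mathcal{S}$ collects the optimal solutions of the linear program over all admissible realizations of the interval input data.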

Dual-density-based reweighted $\ell_{1}$-algorithms for a class of $\ell_{0}$-minimization problems

The optimization problem with sparsity arises in many areas of science and engineering such as compressed sensing, image processing, statistical learning and data sparse approximation. In this paper, we study the dual-density-based reweighted $\ell_{1}$-algorithms for a class of $\ell_{0}$-minimization models which can be used to model a wide range of practical problems. This class of … Read more
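
For orientation, the classical reweighted $\ell_{1}$ scheme (shown only as background; the paper's dual-density-based weights are constructed differently) iterates
\[
x^{k+1} \in \arg\min_x \Big\{ \sum_i w_i^k |x_i| \; : \; Ax = b \Big\},
\qquad
w_i^k = \frac{1}{|x_i^k| + \epsilon},
\]
so that small components are penalized more heavily, pushing the iterates towards sparse solutions of the underlying $\ell_{0}$-minimization problem.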