Deterministic global optimization with trained neural networks: Is the envelope of single neurons worth it?

Optimization problems containing trained neural networks remain challenging due to their nonconvexity. Deterministic global optimization relies on relaxations that should be tight, quickly convergent, and cheap to evaluate. While envelopes of common activation functions have been established for several years, the envelope of an entire neuron had not been. Recently, Carrasco and Muñoz (arXiv:2410.23362, 2024) proposed … Read more
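To make the distinction concrete, here is a hedged sketch in notation assumed for illustration rather than taken from the abstract: a single neuron with activation $\sigma$ and inputs restricted to a box $x \in [x^L, x^U]$ computes
\[
  y = \sigma\!\left(w^\top x + b\right).
\]
A standard relaxation first bounds the pre-activation $z = w^\top x + b \in [z^L, z^U]$ and then applies the known envelope of $\sigma$ on $[z^L, z^U]$; the envelope of the entire neuron is instead the tightest convex underestimator and concave overestimator of the composite map $x \mapsto \sigma(w^\top x + b)$ over the box $[x^L, x^U]$, which can only be tighter but may be more expensive to evaluate.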

MUSE-BB: A Decomposition Algorithm for Nonconvex Two-Stage Problems using Strong Multisection Branching

We present MUSE-BB, a branch-and-bound (B&B)-based decomposition algorithm for the deterministic global solution of nonconvex two-stage stochastic programming problems. In contrast to three recent decomposition algorithms, which solve this type of problem in a projected form by nesting an inner B&B in an outer B&B on the first-stage variables, we branch on all variables … Read more
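For context, a hedged sketch of the problem class, with symbols chosen here for illustration: a two-stage stochastic program over scenarios $s = 1, \dots, S$ with probabilities $p_s$ can be written as
\[
  \min_{x \in X,\; y_1, \dots, y_S} \; f(x) + \sum_{s=1}^{S} p_s\, g_s(x, y_s)
  \quad \text{s.t.} \quad h_s(x, y_s) \le 0, \quad s = 1, \dots, S,
\]
where $x$ collects the first-stage variables and $y_s$ the scenario-specific second-stage variables. A projected (nested) approach branches only on $x$ and evaluates each second-stage value function $\min_{y_s} \{\, g_s(x, y_s) : h_s(x, y_s) \le 0 \,\}$ with an inner B&B, whereas branching on all variables operates directly in the full space $(x, y_1, \dots, y_S)$.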

Global Dynamic Optimization with Hammerstein-Wiener Models Embedded

Hammerstein-Wiener models constitute a significant class of block-structured dynamic models, as they approximate process nonlinearities on the basis of input-output data without requiring identification of a full nonlinear process model. Optimization problems with Hammerstein-Wiener models embedded are nonconvex, and thus local optimization methods may obtain suboptimal solutions. In this work, we develop a deterministic global … Read more
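As a hedged illustration of the model class, with symbols assumed for this sketch: a Hammerstein-Wiener model places a linear dynamic block between two static nonlinearities,
\[
  v(t) = f\big(u(t)\big), \qquad \dot{x}(t) = A\,x(t) + B\,v(t), \qquad y(t) = g\big(C\,x(t)\big),
\]
so only the static input map $f$ and output map $g$ are nonlinear and are fitted to input-output data, while the internal dynamics remain linear; the nonlinearity of $f$ and $g$ is what renders the embedded optimization problems nonconvex.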