Trust your data or not – StQP remains StQP: Community Detection via Robust Standard Quadratic Optimization

We consider the Robust Standard Quadratic Optimization Problem (RStQP), in which an uncertain (possibly indefinite) quadratic form is extremized over the standard simplex. Following most approaches, we model the uncertainty sets by ellipsoids, polyhedra, or spectrahedra; more precisely, by intersections of sub-cones of the copositive matrix cone. We show that the copositive relaxation gap of … Read more
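For readers unfamiliar with the problem class, a standard quadratic optimization problem (StQP) and a generic robust counterpart can be written as follows (an illustrative formulation; the uncertainty set $\mathcal{U}$ stands for any of the ellipsoidal, polyhedral, or spectrahedral sets mentioned above):

\[\min_{x\in\Delta}\, x^\top Q x,\qquad \Delta=\{x\in R^n : e^\top x=1,\ x\ge 0\},\qquad\text{and}\qquad \min_{x\in\Delta}\,\max_{Q\in\mathcal{U}}\, x^\top Q x,\]

where $e$ is the all-ones vector; the robust version optimizes against the worst-case matrix in $\mathcal{U}$.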

Lectures on Parametric Optimization: An Introduction

The report aims to provide an overview of results from Parametric Optimization that could be called classical results on the subject. Parametric Optimization considers optimization problems depending on a parameter and describes how the feasible set, the value function, and the local or global minimizers of the program depend on changes in the parameter. After … Read more
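As a point of reference, the generic parametric program and the associated objects studied in such lectures take the form (notation here is illustrative):

\[P(t):\quad \min_x\; f(x,t)\ \ \text{s.t.}\ \ x\in M(t),\qquad v(t)\,=\,\inf\{f(x,t): x\in M(t)\},\]

with feasible set $M(t)$, value function $v(t)$, and solution set $S(t)=\{x\in M(t): f(x,t)=v(t)\}$, each viewed as a (set-valued) mapping of the parameter $t$.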

Improved Regularity Assumptions for Partial Outer Convexification of Mixed-Integer PDE-Constrained Optimization Problems

Partial outer convexification is a relaxation technique for mixed-integer optimal control problems (MIOCPs) constrained by time-dependent differential equations. Sum-Up Rounding algorithms make it possible to approximate feasible points of the relaxed, convexified continuous problem with binary ones that are feasible up to an arbitrarily small $\delta > 0$. We show that this approximation property holds for ODEs and semilinear PDEs under … Read more
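As an illustration, here is a minimal sketch of the classical Sum-Up Rounding scheme for relaxed controls discretized on a uniform grid (the function and variable names, and the uniform-grid assumption, are ours; the paper treats the general setting):

import numpy as np

def sum_up_rounding(alpha, dt):
    # Round relaxed controls alpha[i, j] (time step i, control j; each row
    # sums to 1) to binary controls omega satisfying the SOS1 property.
    n_steps, n_controls = alpha.shape
    omega = np.zeros_like(alpha)
    gap = np.zeros(n_controls)  # accumulated integrated deviation per control
    for i in range(n_steps):
        gap += alpha[i] * dt        # relaxed contribution of interval i
        j = np.argmax(gap)          # control with largest accumulated deficit
        omega[i, j] = 1.0           # activate exactly one control per interval
        gap[j] -= dt                # subtract the binary contribution
    return omega

The integrated deviation between alpha and omega then shrinks linearly with the grid size, which is the approximation property the abstract refers to.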

Representation of distributionally robust chance-constraints

Given $X\subset R^n$, $\varepsilon \in (0,1)$, and a parametrized family of probability distributions $(\mu_{a})_{a\in A}$ on $\Omega\subset R^p$, we consider the feasible set $X^*_\varepsilon\subset X$ associated with the {\em distributionally robust} chance-constraint \[X^*_\varepsilon\,=\,\{x\in X:\:{\rm Prob}_\mu[f(x,\omega)\,>\,0]> 1-\varepsilon,\,\forall\mu\in\mathscr{M}_a\},\] where $\mathscr{M}_a$ is the set of all possible mixtures of the distributions $\mu_a$, $a\in A$. For instance and typically, the family … Read more
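For concreteness, one way to write the mixture ambiguity set is (our reading of the construction; $\varphi$ denotes a mixing probability measure on $A$):

\[\mathscr{M}_a\,=\,\Big\{\mu:\ \mu(\cdot)\,=\,\int_A \mu_a(\cdot)\,d\varphi(a),\ \varphi\ \text{a probability measure on}\ A\Big\},\]

so that the chance-constraint must hold uniformly over every such mixture.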

Approximation Properties of Sum-Up Rounding in the Presence of Vanishing Constraints

Approximation algorithms like sum-up rounding, which make it possible to compute integer-valued approximations of the continuous controls in a weak$^*$ sense, have attracted interest recently. They can approximate (optimal) feasible solutions of continuous relaxations of mixed-integer control problems (MIOCPs) with integer controls arbitrarily closely. To this end, they use compactness properties of the underlying state equation, … Read more
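The approximation property at stake is usually stated as a bound on the integrated control deviation; a representative form (constants and norms vary across the literature) is

\[\sup_{t\in[0,T]}\Big\|\int_0^t\big(\alpha(s)-\omega(s)\big)\,ds\Big\|_\infty\;\le\;C\,\Delta t,\]

where $\alpha$ is the relaxed control, $\omega$ the rounded integer control, $\Delta t$ the rounding grid size, and $C$ a constant depending only on the number of controls; this is exactly the weak$^*$ approximation mentioned above.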

Complexity of gradient descent for multiobjective optimization

A number of first-order methods have been proposed for smooth multiobjective optimization, for which some form of convergence to first-order criticality has been proved. Such convergence is global in the sense of being independent of the starting point. In this paper we analyze the rate of convergence of gradient descent for smooth unconstrained multiobjective … Read more
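For orientation, the steepest descent direction commonly analyzed in the multiobjective setting is defined through the subproblem (a standard Fliege-Svaiter-type formulation; notation is ours):

\[d(x)\,\in\,\arg\min_{d\in R^n}\ \max_{1\le i\le m}\,\nabla f_i(x)^\top d\,+\,\tfrac{1}{2}\|d\|^2,\]

and a point $x$ is (Pareto) first-order critical precisely when $d(x)=0$, so $\|d(x)\|$ serves as the criticality measure whose decay rate is analyzed.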

Block Coordinate Proximal Gradient Method for Nonconvex Optimization Problems: Convergence Analysis

We propose a block coordinate proximal gradient method for a composite minimization problem with two nonconvex function components in the objective, only one of which is assumed to be differentiable. Under some per-block Lipschitz-like conditions based on Bregman distance, but without the global Lipschitz continuity of the gradient of the differentiable function, we prove … Read more
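A minimal sketch of one sweep of a two-block proximal gradient update of this flavor, where a Euclidean proximal step and an l1 term stand in for the paper's Bregman machinery and nonsmooth block components (all names are ours):

import numpy as np

def prox_l1(v, t):
    # soft-thresholding: proximal map of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bcpg_sweep(x, y, grad_x, grad_y, step_x, step_y):
    # One block-coordinate sweep: update block x with y fixed, then block y
    # with the new x; grad_x(x, y) and grad_y(x, y) return the partial
    # gradients of the smooth coupling term.
    x = prox_l1(x - step_x * grad_x(x, y), step_x)
    y = prox_l1(y - step_y * grad_y(x, y), step_y)
    return x, y

The per-block step sizes play the role of the Lipschitz-like constants the abstract refers to.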

A Merit Function Approach for Evolution Strategies

In this paper, we extend a class of globally convergent evolution strategies to handle general constrained optimization problems. The proposed framework handles relaxable constraints using a merit function approach combined with a specific restoration procedure. The unrelaxable constraints in our framework, when present, are treated either by using the extreme barrier function or through a … Read more
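The extreme barrier device mentioned here is easy to state in code; a minimal sketch (the wrapper name is ours):

import numpy as np

def extreme_barrier(f, is_feasible):
    # Wrap objective f: return +inf outside the unrelaxable constraints, so
    # the evolution strategy never ranks an infeasible point above a
    # feasible one.
    def f_barrier(x):
        return f(x) if is_feasible(x) else np.inf
    return f_barrier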

Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers

We present DFO-LS, a software package for derivative-free optimization (DFO) for nonlinear Least-Squares (LS) problems, with optional bound constraints. Inspired by the Gauss-Newton method, DFO-LS constructs simplified linear regression models for the residuals. DFO-LS allows flexible initialization for expensive problems, whereby it can begin making progress from as few as two objective evaluations. Numerical results … Read more
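A minimal usage sketch following the pattern in the DFO-LS documentation (https://github.com/numericalalgorithmsgroup/dfols); we assume the package is installed as dfols:

import numpy as np
import dfols

def objfun(x):
    # residual vector of the classic Rosenbrock test problem
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

x0 = np.array([-1.2, 1.0])
soln = dfols.solve(objfun, x0)  # optional bounds via bounds=(lower, upper)
print(soln.x)

DFO-LS minimizes the sum of squares of the returned residuals without requiring any derivative information.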

Derivative-Free Superiorization With Component-Wise Perturbations

Superiorization reduces, not necessarily minimizes, the value of a target function while seeking constraints-compatibility. This is done by taking a solely feasibility-seeking algorithm, analyzing its perturbation resilience, and proactively perturbing its iterates accordingly to steer them toward a feasible point with a reduced value of the target function. When the perturbation steps are computationally efficient, this … Read more
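In outline, a superiorized iteration interleaves a feasibility-seeking step with summable perturbations that reduce the target function; a minimal sketch, with a projection operator and a normalized negative-gradient perturbation as our illustrative choices (the paper's component-wise perturbations replace the full-gradient direction used here):

import numpy as np

def superiorize(x, project, target_grad, n_iter=100, beta0=1.0, kappa=0.5):
    # Basic superiorization: perturb toward a lower target value with
    # summable step sizes beta_k, then apply the feasibility-seeking step.
    beta = beta0
    for _ in range(n_iter):
        g = target_grad(x)
        norm = np.linalg.norm(g)
        if norm > 0:
            x = x + beta * (-g / norm)  # nonascending perturbation
        beta *= kappa                   # geometric, hence summable, steps
        x = project(x)                  # feasibility-seeking step
    return x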