Improving the linear relaxation of maximum $k$-cut with semidefinite-based constraints

We consider the maximum $k$-cut problem that involves partitioning the vertex set of a graph into $k$ subsets such that the sum of the weights of the edges joining vertices in different subsets is maximized. The associated semidefinite programming (SDP) relaxation is known to provide strong bounds, but it has a high computational cost. We … Read more
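
For reference, a standard semidefinite relaxation of maximum $k$-cut (in the spirit of Frieze and Jerrum; the notation here is generic and not necessarily the paper's) reads

$$
\max_{X}\ \frac{k-1}{k}\sum_{(i,j)\in E} w_{ij}\,\bigl(1 - X_{ij}\bigr)
\quad\text{s.t.}\quad
X_{ii}=1\ \ \forall i,\qquad
X_{ij}\ \ge\ -\frac{1}{k-1}\ \ \forall i\neq j,\qquad
X \succeq 0 .
$$

Dropping the conic constraint $X \succeq 0$ leaves a linear program; as the title suggests, the work concerns strengthening such linear relaxations with constraints derived from the semidefinite side, at a lower computational cost than solving the full SDP.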

The Mesh Adaptive Direct Search Algorithm for Granular and Discrete Variables

The mesh adaptive direct search (Mads) algorithm is designed for blackbox optimization problems for which the functions defining the objective and the constraints are typically the outputs of a simulation seen as a blackbox. It is a derivative-free optimization method designed for continuous variables and is supported by a convergence analysis based on the Clarke … Read more
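
As a rough illustration of the direct-search idea only (not the full Mads algorithm, whose mesh, frame, and poll-direction machinery and convergence analysis are considerably richer), a minimal coordinate-poll loop in Python might look like the following sketch; all names and update rules here are illustrative:

```python
import numpy as np

def simple_poll_search(f, x0, delta=1.0, delta_min=1e-6, max_eval=1000):
    """Highly simplified mesh/poll loop in the spirit of direct search.

    Illustrative sketch only: the actual Mads algorithm uses a mesh and a
    frame with richer poll directions and step-size update rules.
    """
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    n_eval, n = 1, x.size
    while delta > delta_min and n_eval < max_eval:
        improved = False
        # Poll step: try trial points along +/- coordinate directions.
        for d in np.vstack([np.eye(n), -np.eye(n)]):
            t = x + delta * d
            ft = f(t)
            n_eval += 1
            if ft < fx:              # success: move to the better point
                x, fx, improved = t, ft, True
                break
        # Enlarge the step on success, shrink it on failure.
        delta = 2.0 * delta if improved else 0.5 * delta
    return x, fx

# Example: minimize a smooth test function treated as a blackbox.
x_best, f_best = simple_poll_search(lambda x: float(np.sum(x ** 2)), [3.0, -2.0])
```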

A realistic energy optimization model for smart-home appliances

Smart homes have the potential to achieve optimal energy consumption with appropriate scheduling. The control of smart appliances can be based on optimization models, which should be realistic and efficient. However, increased realism also implies an increase in solution time. Many of the optimization models in the literature have limitations on the types of appliances … Read more
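
To make the scheduling idea concrete, here is a toy version of the problem, choosing when to run a single shiftable appliance under time-of-use prices, solved by simple enumeration. The prices, duration, and time window below are made up for illustration and are far simpler than the realistic appliance models the paper is concerned with:

```python
def best_start_slot(prices, duration, earliest, latest):
    """Brute-force the cheapest start time for one shiftable appliance.

    prices   : electricity price per time slot (illustrative values)
    duration : number of consecutive slots the appliance runs
    earliest, latest : allowed window for the start slot (inclusive)
    """
    best = None
    for start in range(earliest, latest + 1):
        cost = sum(prices[start:start + duration])
        if best is None or cost < best[1]:
            best = (start, cost)
    return best

# Hypothetical hourly prices over one day and a 3-hour appliance cycle.
prices = [0.10, 0.10, 0.08, 0.08, 0.09, 0.12, 0.20, 0.25, 0.25, 0.22,
          0.18, 0.15, 0.15, 0.14, 0.14, 0.16, 0.22, 0.28, 0.28, 0.24,
          0.18, 0.14, 0.12, 0.10]
print(best_start_slot(prices, duration=3, earliest=0, latest=21))  # cheapest 3-hour window starts at slot 2
```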

Tight-and-cheap conic relaxation for the AC optimal power flow problem

The classical alternating current optimal power flow problem is highly nonconvex and generally hard to solve. Convex relaxations, in particular semidefinite, second-order cone, convex quadratic, and linear relaxations, have recently attracted significant interest. The semidefinite relaxation is the strongest among them and is exact for many cases. However, the computational efficiency for solving large-scale semidefinite … Read more
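
In broad strokes, the conic relaxations referred to here start from the lifted matrix variable $W = v\,v^{\mathsf H}$, where $v$ is the vector of complex bus voltages (generic notation, not necessarily the paper's):

$$
W = v\,v^{\mathsf H}
\quad\Longleftrightarrow\quad
W \succeq 0 \ \text{ and } \ \operatorname{rank}(W) = 1 .
$$

Power flows and voltage-magnitude limits, which are quadratic in $v$, become linear in $W$ (for instance $|v_i|^2 = W_{ii}$). The semidefinite relaxation keeps $W \succeq 0$ and drops the nonconvex rank constraint, while the second-order cone relaxation enforces positive semidefiniteness only on the $2\times 2$ principal submatrices of $W$ associated with the network's lines, which is cheaper but generally weaker.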

Efficient solution of quadratically constrained quadratic subproblems within the MADS algorithm

The Mesh Adaptive Direct Search algorithm (MADS) is an iterative method for constrained blackbox optimization problems. One of the optional MADS features is a versatile search step in which quadratic models are built leading to a series of quadratically constrained quadratic subproblems. This work explores different algorithms that exploit the structure of the quadratic models: … Read more
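
The subproblems in question have the generic quadratically constrained quadratic form (the notation below is generic):

$$
\min_{x\in\mathbb{R}^n}\ \tfrac{1}{2}\,x^{\mathsf T} Q_0\, x + g_0^{\mathsf T} x
\qquad\text{s.t.}\qquad
\tfrac{1}{2}\,x^{\mathsf T} Q_j\, x + g_j^{\mathsf T} x + b_j \le 0,\quad j = 1,\dots,m,
$$

where the matrices $Q_j$ come from the quadratic models of the blackbox objective and constraints and need not be positive semidefinite, so the subproblems may be nonconvex.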

Locally weighted regression models for surrogate-assisted design optimization

Locally weighted regression combines the advantages of polynomial regression and kernel smoothing. We present three ideas for appropriate and effective use of LOcally WEighted Scatterplot Smoothing (LOWESS) models for surrogate optimization. First, a method is proposed to reduce the computational cost of LOWESS models. Second, a local scaling coefficient is introduced to adapt LOWESS models … Read more
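
For readers unfamiliar with LOWESS, a bare-bones locally weighted linear predictor (tricube weights, nearest-neighbour bandwidth) looks roughly like the sketch below. It is a generic implementation, not the authors' surrogate with the cost-reduction and local-scaling ideas mentioned above:

```python
import numpy as np

def lowess_predict(x_query, X, y, frac=0.5):
    """Predict y at x_query with a locally weighted linear fit (LOWESS-style).

    X : (n, d) array of sampled points; y : (n,) array of observed values.
    frac : fraction of the data used in each local fit.
    """
    X = np.atleast_2d(np.asarray(X, dtype=float))
    y = np.asarray(y, dtype=float)
    q = np.asarray(x_query, dtype=float)
    n, d = X.shape
    k = max(int(np.ceil(frac * n)), d + 1)           # local neighbourhood size
    dist = np.linalg.norm(X - q, axis=1)
    idx = np.argsort(dist)[:k]                       # k nearest neighbours of the query
    h = dist[idx].max() + 1e-12                      # bandwidth = neighbourhood radius
    w = (1.0 - (dist[idx] / h) ** 3) ** 3            # tricube kernel weights
    A = np.hstack([np.ones((k, 1)), X[idx] - q])     # linear basis centred at the query
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(A * sw[:, None], y[idx] * sw, rcond=None)
    return float(beta[0])                            # intercept = prediction at x_query
```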

Robust optimization of noisy blackbox problems using the Mesh Adaptive Direct Search algorithm

Blackbox optimization problems are often contaminated with numerical noise, and direct search methods such as the Mesh Adaptive Direct Search (MADS) algorithm may get stuck at solutions artificially created by the noise. We propose a way to smooth out the objective function of an unconstrained problem using previously evaluated function values, rather than resampling points. … Read more
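
A minimal version of the smoothing idea, reusing cached evaluations instead of resampling, could be a kernel-weighted average of the stored values; the Gaussian kernel and fixed bandwidth below are illustrative choices, not necessarily those of the paper:

```python
import numpy as np

def smoothed_objective(x, cache_X, cache_f, bandwidth=0.1):
    """Kernel-weighted average of previously evaluated objective values near x.

    cache_X : (n, d) array of already-evaluated points
    cache_f : (n,) array of the (noisy) objective values at those points
    """
    x = np.asarray(x, dtype=float)
    cache_X = np.atleast_2d(np.asarray(cache_X, dtype=float))
    cache_f = np.asarray(cache_f, dtype=float)
    d2 = np.sum((cache_X - x) ** 2, axis=1)        # squared distances to x
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))       # Gaussian weights
    return float(np.dot(w, cache_f) / np.sum(w))   # weighted mean of cached values
```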

A derivative-free trust-region augmented Lagrangian algorithm

We present a new derivative-free trust-region (DFTR) algorithm to solve general nonlinear constrained problems using an augmented Lagrangian method. No derivatives are used, either for the objective function or for the constraints. An augmented Lagrangian method, known as an effective tool for solving equality- and inequality-constrained optimization problems when derivatives are available, is … Read more
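
For context, a standard augmented Lagrangian for a problem with equality constraints $h(x)=0$ and inequality constraints $g(x)\le 0$ (generic notation, not necessarily the paper's) is

$$
L_\rho(x,\lambda,\mu) \;=\; f(x) \;+\; \lambda^{\mathsf T} h(x) \;+\; \frac{\rho}{2}\,\|h(x)\|^2
\;+\; \frac{1}{2\rho}\sum_{i}\Bigl(\max\bigl\{0,\ \mu_i + \rho\, g_i(x)\bigr\}^2 - \mu_i^2\Bigr),
$$

where the outer iterations approximately minimize $L_\rho$ in $x$ before updating the multipliers $\lambda,\mu$ and, if necessary, the penalty parameter $\rho$; in a derivative-free setting these inner minimizations must rely on function values only, for instance through trust-region models.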

A progressive barrier derivative-free trust-region algorithm for constrained optimization

We study derivative-free constrained optimization problems and propose a trust-region method that builds linear or quadratic models around the best feasible and the best infeasible solutions found so far. These models are optimized within a trust region, and the progressive barrier methodology handles the constraints by progressively pushing the infeasible solutions toward the … Read more
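
The progressive barrier relies on an aggregate constraint violation; with the constraints written as $c_j(x)\le 0$, a common choice in the MADS literature is

$$
h(x) \;=\; \sum_{j}\bigl(\max\{0,\ c_j(x)\}\bigr)^{2},
$$

so that $h(x)=0$ exactly when $x$ is feasible. Infeasible trial points are accepted only while their violation stays below a threshold $h_{\max}$, which is progressively reduced as the algorithm proceeds.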

Order-based error for managing ensembles of surrogates in derivative-free optimization

We investigate surrogate-assisted strategies for derivative-free optimization using the mesh adaptive direct search (MADS) blackbox optimization algorithm. In particular, we build an ensemble of surrogate models to be used within the search step of MADS, and examine different methods for selecting the best model for the problem at hand. To do so, we introduce … Read more
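
One natural way to formalize an order-based error, and a plausible reading of the idea (the paper's exact definition may differ), is the fraction of point pairs whose relative ordering the surrogate predicts incorrectly:

```python
import itertools
import numpy as np

def order_error(f_true, f_model):
    """Fraction of point pairs whose ordering the surrogate gets wrong."""
    f_true = np.asarray(f_true, dtype=float)
    f_model = np.asarray(f_model, dtype=float)
    pairs = list(itertools.combinations(range(f_true.size), 2))
    if not pairs:
        return 0.0
    wrong = sum(
        1 for i, j in pairs
        if (f_true[i] - f_true[j]) * (f_model[i] - f_model[j]) < 0
    )
    return wrong / len(pairs)
```

Such a metric rewards surrogates that rank candidate points correctly, which is what matters when the surrogate is only used to order trial points within the search step.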