Two decades of blackbox optimization applications

This work reviews blackbox optimization applications over the last twenty years, addressed using direct search optimization methods. Emphasis is placed on the Mesh Adaptive Direct Search (MADS) derivative-free optimization algorithm. The core of the document describes applications in three specific fields: energy, materials science, and computational engineering design. Other applications in science and engineering as … Read more
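
As a rough illustration of the kind of method surveyed here, the sketch below implements a plain coordinate search on a toy function: it polls one step in each direction along every coordinate, moves to the best improving point, and otherwise shrinks the step. It is not MADS itself (which polls on an adaptive mesh with a richer set of directions), and the `blackbox` objective is only a hypothetical stand-in for an expensive simulation.

```python
import itertools

def blackbox(x):
    # Hypothetical expensive simulation; here a simple smooth stand-in.
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def coordinate_search(f, x0, step=1.0, tol=1e-6, max_evals=500):
    """Basic direct search: poll +/- step along each coordinate,
    move to the best improving point, otherwise shrink the step."""
    x, fx, evals = list(x0), f(x0), 1
    while step > tol and evals < max_evals:
        best_x, best_f = x, fx
        for i, s in itertools.product(range(len(x)), (+step, -step)):
            trial = list(x)
            trial[i] += s
            ft = f(trial)
            evals += 1
            if ft < best_f:
                best_x, best_f = trial, ft
        if best_f < fx:
            x, fx = best_x, best_f      # successful poll: accept the move
        else:
            step *= 0.5                 # unsuccessful poll: refine the step
    return x, fx

print(coordinate_search(blackbox, [5.0, 5.0]))
```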

DMulti-MADS: Mesh adaptive direct multisearch for blackbox multiobjective optimization

The context of this research is multiobjective optimization where conflicting objectives are present. In this work, these objectives are only available as the outputs of a blackbox for which no derivative information is available. This work proposes a new extension of the mesh adaptive direct search (MADS) algorithm to constrained multiobjective derivative-free optimization. This method … Read more
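
A building block of any multiobjective direct search is deciding which evaluated points are non-dominated. The sketch below (a minimal version of my own with made-up bi-objective data, not the DMulti-MADS implementation) filters a list of objective vectors down to its non-dominated subset under minimization.

```python
def dominates(fa, fb):
    """fa dominates fb (minimization): no worse in every objective,
    strictly better in at least one."""
    return all(a <= b for a, b in zip(fa, fb)) and any(a < b for a, b in zip(fa, fb))

def pareto_filter(points):
    """Keep only non-dominated objective vectors from a list of evaluated points."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Toy bi-objective evaluations (f1, f2), both to be minimized.
evals = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (2.5, 2.5)]
print(pareto_filter(evals))  # (3.0, 4.0) is dominated by (2.0, 3.0)
```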

Computational study of a branching algorithm for the maximum k-cut problem

This work considers the graph partitioning problem known as maximum k-cut. It focuses on investigating features of a branch-and-bound method to efficiently obtain global solutions. An exhaustive experimental study is carried out for two main components of a branch-and-bound algorithm: computing bounds and branching strategies. In particular, we propose the use of a variable neighborhood … Read more
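
For context, the maximum k-cut problem asks for a partition of the vertex set into k parts that maximizes the total weight of edges joining different parts. The sketch below evaluates the cut value of a candidate partition on a toy weighted graph; it only illustrates the objective, not the branch-and-bound code studied in the paper.

```python
def k_cut_value(edges, part):
    """Total weight of edges whose endpoints are assigned to different parts.
    edges: list of (u, v, weight); part: dict vertex -> part index in {0, ..., k-1}."""
    return sum(w for u, v, w in edges if part[u] != part[v])

# Toy weighted graph and a candidate 3-partition.
edges = [("a", "b", 2.0), ("b", "c", 1.0), ("a", "c", 3.0), ("c", "d", 4.0)]
part = {"a": 0, "b": 1, "c": 2, "d": 0}
print(k_cut_value(edges, part))  # 2.0 + 1.0 + 3.0 + 4.0 = 10.0
```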

Optimization of noisy blackboxes with adaptive precision

In derivative-free and blackbox optimization, the objective function is often evaluated through the execution of a computer program seen as a blackbox. It can be noisy, in the sense that its outputs are contaminated by random errors. Sometimes, the source of these errors is identified and controllable, in the sense that it is possible to … Read more
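
The following sketch illustrates what a controllable noise source can look like: a hypothetical blackbox whose output is a sample average of noisy replications, so that the number of replications directly controls the precision of each evaluation. The setting is illustrative, not one of the paper's test problems.

```python
import random
import statistics

def noisy_blackbox(x, n_samples):
    """Hypothetical noisy evaluation: the true value x**2 is observed through
    additive Gaussian noise, and averaging n_samples replications controls the
    precision (the standard error shrinks like 1/sqrt(n_samples))."""
    true_value = x ** 2
    samples = [true_value + random.gauss(0.0, 1.0) for _ in range(n_samples)]
    return statistics.mean(samples)

for n in (10, 100, 10000):
    print(n, noisy_blackbox(2.0, n))  # estimates of 4.0 with increasing precision
```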

Stochastic mesh adaptive direct search for blackbox optimization using probabilistic estimates

We present a stochastic extension of the mesh adaptive direct search (MADS) algorithm originally developed for deterministic blackbox optimization. The algorithm, called StoMADS, considers the unconstrained optimization of an objective function f whose values can be computed only through a blackbox corrupted by some random noise following an unknown distribution. The proposed method is based … Read more
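
The sketch below illustrates, in a simplified form, the kind of estimate-based comparison such a method relies on: each point is evaluated through a sample average of noisy blackbox calls, and a trial point is accepted only if its estimate beats the incumbent's by a margin tied to the current frame size. The names, constants, and toy objective are assumptions for illustration; this is not the StoMADS algorithm itself.

```python
import random
import statistics

def noisy_f(x):
    # Hypothetical blackbox: true objective (x - 3)**2 observed with additive noise.
    return (x - 3.0) ** 2 + random.gauss(0.0, 0.5)

def estimate(x, n):
    """Sample-average estimate of f(x) from n independent noisy evaluations."""
    return statistics.mean(noisy_f(x) for _ in range(n))

def accept_trial(x_incumbent, x_trial, frame_size, n=30, gamma=1.0):
    """Accept the trial point only if its estimate improves on the incumbent's
    estimate by a margin proportional to the squared frame size."""
    return estimate(x_trial, n) <= estimate(x_incumbent, n) - gamma * frame_size ** 2

print(accept_trial(x_incumbent=5.0, x_trial=4.0, frame_size=0.5))
```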

HyperNOMAD: Hyperparameter optimization of deep neural networks using mesh adaptive direct search

The performance of deep neural networks is highly sensitive to the choice of the hyperparameters that define the structure of the network and the learning process. When facing a new application, tuning a deep neural network is a tedious and time-consuming process that is often described as a “dark art”. This explains the necessity … Read more
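
Cast as a blackbox problem, each hyperparameter evaluation trains a network and returns a validation error, and the optimizer only sees the mapping from hyperparameters to that error. The stub below sketches such an interface with mixed variable types (integer, continuous, categorical); the names and the placeholder return value are hypothetical and do not reflect the HyperNOMAD API.

```python
def validation_error(hyperparams):
    """Hypothetical blackbox: one evaluation trains a network with the given
    hyperparameters and returns its validation error. The training itself is
    stubbed out here; in practice it is the expensive part of each call."""
    n_layers = hyperparams["n_layers"]            # integer variable
    learning_rate = hyperparams["learning_rate"]  # continuous variable
    optimizer = hyperparams["optimizer"]          # categorical variable
    # ... train the network with (n_layers, learning_rate, optimizer),
    # then evaluate it on a held-out validation set ...
    return 0.25  # placeholder standing in for the measured validation error

trial = {"n_layers": 4, "learning_rate": 1e-3, "optimizer": "adam"}
print(validation_error(trial))
```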

CONICOPF: Conic relaxations for AC optimal power flow computations

Computational speed and global optimality are key needs for practical algorithms for the optimal power flow problem. Two convex relaxations offer a favorable trade-off between the standard second-order cone and the standard semidefinite relaxations for large-scale meshed networks in terms of optimality gap and computation time: the tight-and-cheap relaxation (TCR) and the quadratic convex relaxation … Read more
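
One standard ingredient behind such conic relaxations is to lift voltage products into new variables and replace the nonconvex rank-one coupling by a convex second-order cone inequality, as sketched below. The notation is mine and not necessarily the paper's.

```latex
% Lifted variables W_{ij} \approx V_i V_j^* couple the bus voltages.
% Exact (nonconvex) coupling on the left, second-order cone relaxation on the right:
\[
  \lvert W_{ij}\rvert^2 = W_{ii}\,W_{jj}
  \quad\longrightarrow\quad
  \lvert W_{ij}\rvert^2 \le W_{ii}\,W_{jj},
  \qquad W_{ii}\ge 0,\ W_{jj}\ge 0 .
\]
```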

A Framework for Peak Shaving Through the Coordination of Smart Homes

In demand-response programs, aggregators balance the needs of generation companies and end-users. This work proposes a two-phase framework that shaves the aggregated peak loads while maintaining the desired comfort level for users. In the first phase, the users determine their planned consumption. For the second phase, we develop a bilevel model with mixed-integer variables and … Read more
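
The first-phase output can be pictured as one planned consumption profile per home; the aggregated peak that the second phase targets is then the maximum of the summed profiles. The sketch below computes that quantity on made-up data; it is an illustration of the target quantity only, not the bilevel model.

```python
def aggregated_peak(profiles):
    """profiles: list of per-user consumption profiles (kW per time slot).
    Returns the aggregated load per slot and its peak value."""
    aggregated = [sum(loads) for loads in zip(*profiles)]
    return aggregated, max(aggregated)

# Toy planned consumption for three homes over four time slots (kW).
profiles = [
    [1.0, 2.5, 3.0, 1.5],
    [0.5, 1.0, 2.0, 2.5],
    [1.5, 2.0, 2.5, 1.0],
]
load, peak = aggregated_peak(profiles)
print(load, peak)  # [3.0, 5.5, 7.5, 5.0] 7.5
```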

A general framework for customized transition to smart homes

Smart homes have the potential to achieve efficient energy consumption: households can profit from appropriately scheduled consumption. By 2020, 35% of all households in North America and 20% in Europe are expected to become smart homes. Developing a smart home requires considerable investment, and the householders expect a positive return. In this context, we address … Read more

Performance indicators in multiobjective optimization

In recent years, the development of new algorithms for multiobjective optimization has grown considerably. A large number of performance indicators have been introduced to measure the quality of Pareto front approximations produced by these algorithms. In this work, we propose a review of a total of 63 performance indicators partitioned into four groups according to … Read more
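
As a concrete example of one well-known indicator of this kind, the sketch below computes the hypervolume of a bi-objective Pareto front approximation against a reference point (both objectives minimized). The implementation and data are my own minimal illustration and are not tied to the paper.

```python
def hypervolume_2d(front, ref):
    """Hypervolume (area) dominated by a 2D Pareto front approximation,
    measured against a reference point; both objectives are minimized."""
    pts = sorted(front)                       # sweep the points by increasing f1
    area, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:                      # skip dominated points
            area += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return area

front = [(1.0, 4.0), (2.0, 2.0), (3.0, 1.0)]
print(hypervolume_2d(front, ref=(5.0, 5.0)))  # 4*1 + 3*2 + 2*1 = 12.0
```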