Smoothing and Worst Case Complexity for Direct-Search Methods in Non-Smooth Optimization

For smooth objective functions, it has been shown that the worst-case cost of direct-search methods is of the same order as that of steepest descent, when measured in the number of iterations needed to achieve a certain threshold of stationarity. Motivated by the lack of such a result in the non-smooth case, we propose, analyze, …
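
The truncated abstract does not state the precise smoothing construction the paper adopts; for orientation, the following is the standard notion of a smoothing function of a nonsmooth objective f, written here as an assumption rather than the paper's definition:

```latex
% Standard smoothing-function notion (assumed; not taken from the paper):
% \tilde f(\cdot,\mu) is continuously differentiable for every \mu > 0, and
\[
  \lim_{z \to x,\; \mu \downarrow 0} \tilde f(z,\mu) \;=\; f(x)
  \qquad \text{for all } x \in \mathbb{R}^n ,
\]
% so that driving \mu to zero recovers the original nonsmooth objective.
```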

Globally Convergent Evolution Strategies and CMA-ES

In this paper we show how to modify a large class of evolution strategies (ES) to rigorously achieve a form of global convergence, meaning convergence to stationary points independently of the starting point. The ES under consideration recombine the parents by means of a weighted sum, around which the offspring are computed by …
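
As a rough illustration of the weighted-sum recombination the abstract describes, here is a minimal Python sketch of one ES generation; the weights, offspring count, and isotropic Gaussian sampling are assumptions patterned on standard (CMA-)ES practice, not details taken from the paper:

```python
import numpy as np

def es_generation(parents, weights, sigma, rng):
    """One ES generation: weighted-sum recombination of the parents,
    then Gaussian offspring sampled around the recombined point.
    (Isotropic sampling is an assumption; CMA-ES adapts a covariance.)"""
    m = weights @ parents                 # weighted recombination of parents
    lam = 2 * len(parents)                # offspring count (assumed)
    offspring = m + sigma * rng.standard_normal((lam, parents.shape[1]))
    return offspring

rng = np.random.default_rng(0)
parents = rng.standard_normal((4, 3))            # 4 parents in R^3
weights = np.array([0.4, 0.3, 0.2, 0.1])         # convex weights (assumed)
print(es_generation(parents, weights, 0.5, rng).shape)  # (8, 3)
```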

A surrogate management framework using rigorous trust-region steps

Surrogate models and heuristics are frequently used in the engineering optimization community as convenient approaches to deal with functions for which evaluations are expensive or noisy, or which lack convexity. These methodologies typically do not guarantee any type of convergence under reasonable assumptions and frequently exhibit slow convergence. In this paper we will show how to …
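
For context, the "rigorous trust-region steps" of the title presumably refer to the standard acceptance mechanism below; the notation is ours, as the truncated abstract does not spell it out:

```latex
% Standard trust-region acceptance ratio, with m_k the surrogate model
% at iterate x_k and s_k the trial step (notation assumed):
\[
  \rho_k \;=\; \frac{f(x_k) - f(x_k + s_k)}{m_k(x_k) - m_k(x_k + s_k)} ,
\]
% the step is accepted (and the trust-region radius possibly enlarged)
% when \rho_k is sufficiently large, and rejected otherwise.
```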

Inexact solution of NLP subproblems in MINLP

In the context of convex mixed-integer nonlinear programming (MINLP), we investigate how the outer approximation method and the generalized Benders decomposition method are affected when the respective NLP subproblems are solved inexactly. We show that the cuts in the corresponding master problems can be changed to incorporate the inexact residuals, while still ensuring equivalence and finiteness …
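
For reference, these are the standard (exact) outer-approximation cuts added to the master problem at an NLP subproblem solution; the notation is assumed, and the exact form of the paper's inexact relaxation is not given in the truncated text:

```latex
% Exact outer-approximation cuts at an NLP subproblem solution \bar{x},
% for convex f and g (notation assumed):
\[
  \eta \;\ge\; f(\bar{x}) + \nabla f(\bar{x})^{\top} (x - \bar{x}),
  \qquad
  g(\bar{x}) + \nabla g(\bar{x})^{\top} (x - \bar{x}) \;\le\; 0 .
\]
% Per the abstract, the paper relaxes such cuts by the inexact residuals
% of the subproblem solves while retaining equivalence and finiteness.
```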

On partially sparse recovery

In this paper we consider the problem of recovering a partially sparse solution of an underdetermined system of linear equations by minimizing the ℓ1-norm of the part of the solution vector which is known to be sparse. Such a problem is closely related to the classical problem in compressed sensing, where the ℓ1-norm of the …
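
In formulas, the problem described reads as follows, with the splitting x = (x1, x2) and x1 the part known to be sparse; the block notation is ours:

```latex
% Partially sparse recovery: minimize the l1-norm of the sparse part only,
% subject to the underdetermined linear system (block notation assumed):
\[
  \min_{x_1,\,x_2} \;\|x_1\|_1
  \quad \text{subject to} \quad
  A_1 x_1 + A_2 x_2 = b .
\]
```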

Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization

Interpolation-based trust-region methods are an important class of algorithms for derivative-free optimization which rely on locally approximating an objective function by quadratic polynomial interpolation models, frequently built from fewer points than there are basis components. Often, in practical applications, the contribution of the problem variables to the objective function is such that many pairwise correlations …
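
One natural way to build such a model from fewer points than basis components, in the spirit of the title's emphasis on sparsity, is an ℓ1-minimization over the model coefficients; the formulation below is an assumption based on that reading, not a statement taken from the truncated abstract:

```latex
% Sparse underdetermined interpolation: with polynomial basis \phi,
% sample set Y, and coefficient vector \alpha (notation assumed), solve
\[
  \min_{\alpha} \;\|\alpha\|_1
  \quad \text{subject to} \quad
  M(\phi, Y)\,\alpha = f(Y),
\]
% where M(\phi, Y) is the (underdetermined) interpolation matrix.
```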

Bilevel Derivative-Free Optimization and its Application to Robust Optimization

We address bilevel programming problems when the derivatives of both the upper- and the lower-level objective functions are unavailable. The core algorithms used for both levels are trust-region interpolation-based methods, using minimum Frobenius norm quadratic models when the number of points is smaller than the number of basis components. We take advantage of the …
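
The minimum Frobenius norm models named in the abstract follow the standard construction in derivative-free optimization: among all quadratics interpolating f on the sample set, pick the one whose Hessian has smallest Frobenius norm (the notation here is ours):

```latex
% Minimum Frobenius norm quadratic model m(y) = c + g^\top y
% + \tfrac{1}{2} y^\top H y on a sample set Y (standard construction):
\[
  \min_{c,\,g,\,H} \;\tfrac{1}{2}\,\|H\|_F^2
  \quad \text{subject to} \quad
  m(y^i) = f(y^i), \;\; y^i \in Y .
\]
```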

Worst Case Complexity of Direct Search

In this paper we prove that direct search of directional type shares the worst-case complexity bound of steepest descent when sufficient decrease is imposed using a quadratic function of the step size parameter. This result is proved under smoothness of the objective function and using a framework of the type of GSS (generating set …
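
Concretely, "sufficient decrease using a quadratic function of the step size parameter" means a poll step is accepted only under a condition of the following form (constants generic, notation ours):

```latex
% Quadratic sufficient-decrease (forcing-function) condition on a poll
% step \alpha_k d, with a fixed constant c > 0:
\[
  f(x_k + \alpha_k d) \;\le\; f(x_k) - c\,\alpha_k^2 ,
\]
% which is what yields the steepest-descent-type bound of
% \mathcal{O}(\epsilon^{-2}) iterations to reach \|\nabla f(x_k)\| \le \epsilon.
```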

Direct Multisearch for Multiobjective Optimization

In practical applications of optimization it is common to have several conflicting objective functions to optimize. Frequently, these functions are subject to noise or can be of black-box type, preventing the use of derivative-based techniques. We propose a novel multiobjective derivative-free methodology, called direct multisearch (DMS), which does not aggregate any of the objective …
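
Since DMS keeps the objectives separate rather than aggregating them, points are compared by Pareto dominance. A minimal Python sketch of that comparison and of filtering a set down to its nondominated points (the function names and minimization convention are ours):

```python
import numpy as np

def dominates(fa, fb):
    """True if objective vector fa Pareto-dominates fb (minimization):
    fa is no worse in every objective and strictly better in at least one."""
    fa, fb = np.asarray(fa), np.asarray(fb)
    return np.all(fa <= fb) and np.any(fa < fb)

def nondominated(points):
    """Filter a list of objective vectors down to its nondominated subset."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

pts = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
print(nondominated(pts))   # (3.0, 3.0) is dominated by (2.0, 2.0)
```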

Analysis of direct searches for non-Lipschitzian functions

It is known that the Clarke generalized directional derivative is nonnegative along the limit directions generated by directional direct-search methods at a limit point of certain subsequences of unsuccessful iterates, provided the function being minimized is Lipschitz continuous near the limit point. In this paper we generalize this result to non-Lipschitzian functions using the Rockafellar generalized …
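
For reference, the Clarke generalized directional derivative invoked above has the standard definition below, valid when f is Lipschitz continuous near x:

```latex
% Clarke generalized directional derivative of f at x in direction d:
\[
  f^{\circ}(x; d) \;=\; \limsup_{y \to x,\; t \downarrow 0}
  \frac{f(y + t d) - f(y)}{t} ,
\]
% and the cited result states f^{\circ}(x^*; d) \ge 0 along the limit
% directions d produced at such a limit point x^* of unsuccessful iterates.
```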