Calibration by Optimization Without Using Derivatives

Applications in engineering frequently require the adjustment of certain parameters. While the mathematical laws that determine these parameters are often well understood, time limitations in everyday industrial life typically make it infeasible to derive an explicit computational procedure for adjusting the parameters from given measurement data. This paper aims …
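
As a toy illustration of the generic setup (the paper's concrete model and data are not reproduced here), calibration can be posed as minimizing a least-squares misfit between a simulator and measurements with a derivative-free routine; the simulator, synthetic data, and choice of Nelder-Mead below are placeholders.

    import numpy as np
    from scipy.optimize import minimize

    t = np.linspace(0.0, 1.0, 20)                        # measurement grid (synthetic)
    measured = 2.0 * np.exp(-1.5 * t) + 0.01 * np.random.default_rng(0).standard_normal(t.size)

    def simulator(p, t):
        a, k = p
        return a * np.exp(-k * t)                        # stand-in for the real model

    def misfit(p):
        return np.sum((simulator(p, t) - measured) ** 2) # least-squares calibration objective

    result = minimize(misfit, x0=[1.0, 1.0], method="Nelder-Mead")  # derivative-free solve
    print(result.x)   # calibrated parameters (a, k), close to (2.0, 1.5) up to noise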

On the optimal order of worst case complexity of direct search

The worst case complexity of direct-search methods has recently been analyzed for the case where they use positive spanning sets and impose a sufficient decrease condition to accept new iterates. Assuming that the objective function is smooth, it is now known that such methods require at most O(n^2 epsilon^{-2}) function evaluations to compute a gradient of norm below …
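
For orientation, here is a minimal sketch of the kind of method covered by such analyses, assuming a maximal positive basis [I, -I] and the quadratic sufficient decrease rho(alpha) = c * alpha^2; all names and constants are illustrative, not taken from the paper.

    import numpy as np

    def direct_search(f, x0, alpha0=1.0, c=1e-4, max_evals=2000, alpha_min=1e-8):
        x = np.asarray(x0, dtype=float)
        n = x.size
        D = np.vstack([np.eye(n), -np.eye(n)])                # 2n poll directions
        alpha, fx, evals = alpha0, f(x), 1
        while evals < max_evals and alpha > alpha_min:
            improved = False
            for d in D:                                       # poll step
                trial = x + alpha * d
                ft = f(trial); evals += 1
                if ft < fx - c * alpha ** 2:                  # sufficient decrease test
                    x, fx, improved = trial, ft, True
                    break
            alpha = 2.0 * alpha if improved else 0.5 * alpha  # expand or contract the step size
        return x, fx, evals

    # example: a smooth quadratic in R^5
    x_best, f_best, nevals = direct_search(lambda x: np.sum((x - 1.0) ** 2), np.zeros(5))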

Direct search based on probabilistic descent

Direct-search methods are a popular class of derivative-free algorithms characterized by evaluating the objective function using a step size and a number of (polling) directions. When applied to the minimization of smooth functions, the polling directions are typically taken from positive spanning sets, which in turn must have at least n+1 vectors in an n-dimensional …
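
The contrast with deterministic polling shows up already in the direction generation; the sketch below (illustrative only) draws a fixed, dimension-independent number of random unit directions per poll instead of a full positive spanning set.

    import numpy as np

    def random_poll_directions(n, m=2, rng=None):
        # m directions drawn uniformly on the unit sphere; m does not grow with n
        rng = np.random.default_rng(0) if rng is None else rng
        D = rng.standard_normal((m, n))
        return D / np.linalg.norm(D, axis=1, keepdims=True)

    print(random_poll_directions(50).shape)   # (2, 50): two random directions per poll
    # a maximal positive basis [I, -I] in R^50 would use 2 * 50 = 100 deterministic directions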

GLODS: Global and Local Optimization using Direct Search

Locating and identifying points as global minimizers is, in general, a hard and time-consuming task. Difficulties increase when the derivatives of the functions defining the problem are not available for use. In this work, we propose a new class of methods suited for global derivative-free constrained optimization. Using direct search of directional type, the algorithm …

Worst case complexity of direct search under convexity

In this paper we prove that the broad class of direct-search methods of directional type, based on imposing sufficient decrease to accept new iterates, exhibits the same global rate, or worst case complexity bound, as the gradient method for the unconstrained minimization of a convex and smooth function. More precisely, it will be shown that …
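
For context, the gradient-method benchmarks being matched are the standard ones (recalled here from general knowledge, not quoted from the paper): for a smooth function with Lipschitz continuous gradient,

    \min_{0 \le j \le k} \|\nabla f(x_j)\| = O(k^{-1/2}) \quad\text{(nonconvex case)}, \qquad
    f(x_k) - f_* = O(1/k) \quad\text{(convex case)},

so reaching \|\nabla f\| \le \epsilon takes O(\epsilon^{-2}) iterations in the nonconvex case, while reaching f(x_k) - f_* \le \epsilon takes only O(\epsilon^{-1}) iterations under convexity.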

A merit function approach for direct search

In this paper we propose to equip direct-search methods with a general procedure for minimizing an objective function, possibly non-smooth, without using derivatives, subject to constraints on the variables. The aim is to handle constraints, most likely nonlinear or non-smooth, for which the derivatives of the corresponding functions are also unavailable. The novelty of …
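
One common way to make such comparisons concrete, not necessarily the construction used in the paper, is a penalty-type merit function that adds a weighted measure of constraint violation to the objective; the weight mu and the problem below are placeholders.

    import numpy as np

    def merit(f, constraints, x, mu=10.0):
        # constraints: callables g_i with g_i(x) <= 0 meaning feasible
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return f(x) + mu * violation

    # example: minimize f subject to x[0] + x[1] >= 1, written as 1 - x0 - x1 <= 0
    f = lambda x: np.sum(x ** 2)
    g = [lambda x: 1.0 - x[0] - x[1]]
    print(merit(f, g, np.array([0.2, 0.3])))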

Smoothing and Worst Case Complexity for Direct-Search Methods in Non-Smooth Optimization

For smooth objective functions it has been shown that the worst case cost of direct-search methods is of the same order as that of steepest descent, when measured in the number of iterations needed to achieve a certain threshold of stationarity. Motivated by the lack of such a result in the non-smooth case, we propose, analyze, …
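
A standard example of a smoothing function, which may differ from the one adopted in the paper, replaces the non-smooth absolute value by a smooth approximation controlled by a parameter mu that is driven to zero.

    import numpy as np

    def smoothed_l1(x, mu):
        # smooth approximation of sum(|x_i|); the error is at most n * mu,
        # since sqrt(t**2 + mu**2) <= |t| + mu for every t
        return np.sum(np.sqrt(x ** 2 + mu ** 2))

    x = np.array([0.5, -0.2, 0.0])
    for mu in (1.0, 0.1, 0.01):
        print(mu, smoothed_l1(x, mu), np.sum(np.abs(x)))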

A Bilevel Direct Search Method for Leader-Follower Optimization Problems and Applications

In this paper, we propose a bilevel direct search method for solving a class of leader-follower problems in which each decision maker’s objective is a “black-box” function. First, we give a description of a leader-follower optimization problem. Then, we investigate a bilevel direct search method including two algorithms for combinatorially solving the upper and lower level …
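
The nested structure such methods must handle can be sketched as follows; the objectives, the inner solver, and the use of Nelder-Mead here are placeholders rather than the paper's two algorithms.

    import numpy as np
    from scipy.optimize import minimize

    def follower_response(x):
        # lower level: the follower minimizes its own black-box objective in y, given x
        inner = minimize(lambda y: (y[0] - x[0]) ** 2 + 0.1 * y[0] ** 2,
                         x0=[0.0], method="Nelder-Mead")
        return inner.x

    def leader_objective(x):
        # upper level: the leader's objective depends on the follower's response
        y = follower_response(x)
        return (x[0] - 1.0) ** 2 + (y[0] - 1.0) ** 2

    outer = minimize(leader_objective, x0=[0.0], method="Nelder-Mead")
    print(outer.x, follower_response(outer.x))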

Worst Case Complexity of Direct Search

In this paper we prove that direct search of directional type shares the worst case complexity bound of steepest descent when sufficient decrease is imposed using a quadratic function of the step size parameter. This result is proved under smoothness of the objective function and using a framework of the type of GSS (generating set …
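
The step-size/stationarity link behind such bounds follows the standard unsuccessful-iteration argument; the notation below (unit poll directions, cosine measure cm(D_k), Lipschitz constant L, forcing constant c) is assumed for illustration rather than quoted from the paper. At an unsuccessful iteration k, every unit direction d in D_k satisfies

    f(x_k + \alpha_k d) \ge f(x_k) - c\,\alpha_k^2
    \quad\text{and}\quad
    f(x_k + \alpha_k d) \le f(x_k) + \alpha_k \nabla f(x_k)^{\top} d + \tfrac{L}{2}\,\alpha_k^2,

so -\nabla f(x_k)^{\top} d \le (L/2 + c)\,\alpha_k for every d, and by the definition of the cosine measure,

    \|\nabla f(x_k)\| \le \mathrm{cm}(D_k)^{-1}\big(\tfrac{L}{2} + c\big)\,\alpha_k,

meaning the step size cannot shrink much faster than the gradient norm, which is what drives the O(epsilon^{-2}) iteration count.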

Direct Multisearch for Multiobjective Optimization

In practical applications of optimization it is common to have several conflicting objective functions to optimize. Frequently, these functions are subject to noise or can be of black-box type, preventing the use of derivative-based techniques. We propose a novel multiobjective derivative-free methodology, called direct multisearch (DMS), which does not aggregate any of the objective …
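
Because the objectives are not aggregated, such a method maintains a list of mutually nondominated points; below is a minimal sketch of the Pareto-dominance filtering involved, with names and structure that are illustrative only.

    import numpy as np

    def dominates(fa, fb):
        # fa dominates fb if it is no worse in every objective and strictly better in one
        return np.all(fa <= fb) and np.any(fa < fb)

    def update_nondominated(archive, candidate):
        # insert an objective vector, discarding it or the points it dominates
        if any(dominates(a, candidate) for a in archive):
            return archive                       # candidate is dominated: discard it
        archive = [a for a in archive if not dominates(candidate, a)]
        archive.append(candidate)
        return archive

    pts = []
    for fvec in [np.array([1.0, 3.0]), np.array([2.0, 2.0]), np.array([1.5, 3.5])]:
        pts = update_nondominated(pts, fvec)
    print(pts)   # the third point is dominated by (1, 3), so two points remain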