Pattern search methods can be made more efficient if past function evaluations are appropriately reused. In this paper we introduce a number of ways of reusing previous evaluations of the objective function, based on the computation of simplex derivatives (e.g., simplex gradients), to improve the efficiency of a pattern search iteration. At each iteration of a pattern search method, one can attempt to compute an accurate simplex gradient by identifying a sampling set of previous iterates with good geometrical properties. This computation can be carried out using only past successful iterates or by considering all past function evaluations. The simplex gradient can then be used, for instance, to reorder the evaluations of the objective function associated with the positive spanning set or positive basis used in the poll step. It can also be used to update the mesh size parameter according to a sufficient decrease criterion. None of these modifications requires new function evaluations. A search step can also be tried along the negative simplex gradient at the beginning of the current pattern search iteration. We present these procedures in detail and show how promising they are for enhancing the practical performance of pattern search methods.
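As a brief illustration of the central quantity, the following is a minimal sketch in standard simplex-gradient notation; the symbols Y, S, \delta(f;Y) and \nabla_s f below are not taken from the abstract itself. Given a current point $y^0$ and a sampling set $Y=\{y^0,y^1,\dots,y^q\}$ of previously evaluated points with good geometry, form
\[
S \;=\; \begin{bmatrix} y^1-y^0 & \cdots & y^q-y^0 \end{bmatrix},
\qquad
\delta(f;Y) \;=\; \begin{bmatrix} f(y^1)-f(y^0) \\ \vdots \\ f(y^q)-f(y^0) \end{bmatrix},
\]
and take a simplex gradient $\nabla_s f(y^0)$ as a solution of
\[
S^\top \nabla_s f(y^0) \;=\; \delta(f;Y),
\]
computed in the least-squares sense when the system is not square. Since $f$ has already been evaluated at the points of $Y$, this costs no new function evaluations. One natural use, in the spirit of the abstract, is to poll the directions $d$ of the positive spanning set $D$ in order of increasing $\nabla_s f(y^0)^\top d$, so that directions more likely to produce descent are tried first.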
Citation
Preprint 04-35, Department of Mathematics, University of Coimbra