Recently, general definitions of pattern search methods for both unconstrained and linearly constrained optimization were presented. It was shown, under mild conditions, that there exists a subsequence of iterates converging to a stationary point. In the unconstrained case, stronger results were derived under additional assumptions. In this paper, we present three small-dimensional examples showing that these results cannot be strengthened without additional assumptions. First, we show that second-order optimality conditions cannot be guaranteed. Second, we show that there can be an accumulation point of the sequence of iterates whose gradient norm is strictly positive. These two examples are also valid for the bound-constrained case. Finally, we show that even under the stronger assumptions of the unconstrained case, there can be infinitely many accumulation points.
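For readers unfamiliar with the algorithm family the abstract refers to, the following is a minimal sketch of one concrete member of it, a compass (coordinate) search on a smooth function. This is an illustrative instance only, not the paper's general definition: the function name, the 2n axis-direction poll set, and the step-halving rule on unsuccessful polls are choices made here for brevity. On a well-behaved problem the iterates approach a stationary point; the paper's examples show what can still go wrong in general.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Minimal compass search: poll the 2n axis directions for simple
    decrease; keep the step on success, halve it on an unsuccessful poll."""
    x = list(x0)
    n = len(x)
    for _ in range(max_iter):
        if step < tol:          # stop once the mesh is fine enough
            break
        fx = f(x)
        improved = False
        for i in range(n):
            for s in (step, -step):
                y = x[:]
                y[i] += s
                if f(y) < fx:   # accept any simple decrease
                    x, improved = y, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5         # refine the mesh after a failed poll
    return x, step

# Minimize a convex quadratic; the iterates approach the minimizer (1, 2).
xmin, final_step = compass_search(lambda v: (v[0] - 1)**2 + (v[1] - 2)**2,
                                  [5.0, -3.0])
```

The step-size update is the mechanism behind the convergence theory: stationarity is argued through subsequences of unsuccessful iterations on finer and finer meshes, which is exactly where the paper's counterexamples probe the limits of the results.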
Technical Report TR98-24, Department of Computational and Applied Mathematics, Rice University, Houston, TX, 1998.
Convergence Results for Pattern Search Algorithms are Tight