A second-order globally convergent direct-search method and its worst-case complexity

Direct-search algorithms form one of the main classes of algorithms for smooth unconstrained derivative-free optimization, due to their simplicity and their well-established convergence results. They proceed by iteratively looking for improvement along some vectors or directions. In the presence of smoothness, first-order global convergence comes from the ability of the vectors to approximate the steepest descent direction, which can be quantified by a first-order criticality (cosine) measure. The use of a set of vectors with a positive cosine measure, together with the imposition of a sufficient decrease condition to accept new iterates, leads to a convergence result as well as a worst-case complexity bound. In this paper, we present a second-order study of a general class of direct-search methods. We start by proving a weak second-order convergence result related to a criticality measure defined along the directions used throughout the iterations. Extensions of this result to obtain a true second-order optimality result are discussed, one possibility being a method using approximate Hessian eigenvectors as directions (which is proved to be truly second-order globally convergent). Numerically guaranteeing such convergence can be rather expensive, as indicated by the worst-case complexity analysis provided in this paper, but it turns out to be appropriate for some pathological examples.
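The first-order mechanism described above, polling along a set of directions with a positive cosine measure and accepting a trial point only under a sufficient decrease condition, can be illustrated with a minimal sketch. The code below is not the algorithm of the paper: the coordinate directions, the forcing function rho(alpha) = c*alpha^2, and all parameter names are illustrative assumptions chosen only to show the structure of a directional direct-search iteration.

```python
import numpy as np

def direct_search(f, x0, alpha0=1.0, gamma=2.0, theta=0.5,
                  c=1e-4, alpha_min=1e-8, max_iter=1000):
    """Minimal directional direct search with a sufficient decrease test.

    Polls the positive spanning set {+e_i, -e_i} (cosine measure 1/sqrt(n))
    and accepts a trial point x + alpha*d only if
        f(x + alpha*d) <= f(x) - c * alpha**2,
    expanding the step size on success and contracting it on failure.
    All parameter values here are illustrative, not those of the paper.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    alpha = alpha0
    fx = f(x)
    # Coordinate directions and their negatives form a positive spanning set.
    D = np.vstack([np.eye(n), -np.eye(n)])
    for _ in range(max_iter):
        if alpha < alpha_min:
            break
        success = False
        for d in D:
            trial = x + alpha * d
            ft = f(trial)
            if ft <= fx - c * alpha**2:   # sufficient decrease accepted
                x, fx = trial, ft
                alpha *= gamma            # expand the step on success
                success = True
                break
        if not success:
            alpha *= theta                # contract the step on failure
    return x, fx

# Example usage: minimize a simple smooth quadratic
if __name__ == "__main__":
    x_star, f_star = direct_search(lambda x: np.sum((x - 1.0) ** 2),
                                   x0=np.zeros(3))
    print(x_star, f_star)
```

A second-order variant, as discussed in the paper, would enrich the polling set with additional directions (for example, approximate eigenvectors of a Hessian approximation) at the cost of extra function evaluations per iteration.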

Citation

S. Gratton, C. W. Royer, and L. N. Vicente, A second-order globally convergent direct-search method and its worst-case complexity, preprint 15-27, Dept. Mathematics, Univ. Coimbra.

Article

Download
