The evaluation complexity of general nonlinear, possibly nonconvex, constrained optimization is analyzed. It is shown that, under suitable smoothness conditions, an $\epsilon$-approximate first-order critical point of the problem can be computed in order $O(\epsilon^{1-2(p+1)/p})$ evaluations of the problem's functions and their first $p$ derivatives. This is achieved by using a two-phase algorithm inspired by Cartis, Gould, and Toint (2011, 2013). It is also shown that strong guarantees (in terms of handling degeneracies) on the possible limit points of the sequence of iterates generated by this algorithm can be obtained at the cost of increased complexity. At variance with previous results, the $\epsilon$-approximate first-order criticality is defined by satisfying a version of the KKT conditions with an accuracy that does not depend on the size of the Lagrange multipliers.
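For illustration only (a direct arithmetic reading of the exponent stated above, not an additional claim of the report), the bound specializes as follows for small $p$:

\[
1 - \frac{2(p+1)}{p} \;=\; -\frac{p+2}{p},
\qquad\text{so } p=1 \;\Rightarrow\; O(\epsilon^{-3}),
\qquad p=2 \;\Rightarrow\; O(\epsilon^{-2}).
\]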
Citation: naXys Technical Report 08-2015, UNamur, Belgium