On approximate KKT condition and its extension to continuous variational inequalities

In this work we introduce a natural necessary sequential Approximate Karush-Kuhn-Tucker (AKKT) condition for a point to be a solution of a continuous variational inequality problem without constraint qualifications, and we prove its relation to the Approximate Gradient Projection (AGP) condition of Garciga-Otero and Svaiter. We also prove that a slight variation of the AKKT condition …
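
For context, a minimal sketch of the AKKT condition in the standard nonlinear programming setting (minimize f subject to g_i(x) <= 0); the paper's extension to variational inequalities replaces the objective gradient with the VI operator and is not reproduced here:

\exists\, x^k \to x^*,\ \lambda^k \in \mathbb{R}^m_+ \ \text{such that}\quad
\nabla f(x^k) + \sum_{i=1}^{m} \lambda_i^k \nabla g_i(x^k) \to 0
\quad\text{and}\quad
\min\{-g_i(x^k),\, \lambda_i^k\} \to 0,\ \ i = 1,\dots,m.

No constraint qualification is assumed at x^*, which is what makes the condition sequential rather than exact.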

On the global convergence of interior-point nonlinear programming algorithms

Carathéodory’s lemma states that if we have a linear combination of vectors in R^n, we can rewrite this combination using a linearly independent subset of those vectors. This result has been successfully applied in nonlinear optimization in many contexts. In this work we present a new version of this celebrated theorem, in which we obtain new bounds for …
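
As a point of reference, one standard statement of the conic version of Carathéodory's lemma commonly used in optimization (the paper's new version and its bounds are not reproduced here): if

v = \sum_{i=1}^{m} \alpha_i v_i, \qquad \alpha_i \ge 0,\ \ v_i \in \mathbb{R}^n,

then there exist an index set I \subseteq \{1,\dots,m\} with \{v_i : i \in I\} linearly independent and coefficients \bar{\alpha}_i \ge 0 such that v = \sum_{i \in I} \bar{\alpha}_i v_i.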

On sequential optimality conditions for smooth constrained optimization

Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Approximate KKT and Approximate Gradient Projection conditions are analyzed in this work. These conditions are not necessarily equivalent. Implications between the different conditions, together with counter-examples, are presented, and algorithmic consequences are discussed.
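
To illustrate the algorithmic side, here is a hypothetical AKKT-type stopping test as it might appear in a solver, with tolerance names invented for the illustration and not taken from the paper:

\|\nabla f(x^k) + \sum_{i=1}^{m} \lambda_i^k \nabla g_i(x^k)\|_\infty \le \varepsilon_{\mathrm{opt}},
\qquad
\|\max\{g(x^k), 0\}\|_\infty \le \varepsilon_{\mathrm{feas}},
\qquad
\|\min\{-g(x^k),\, \lambda^k\}\|_\infty \le \varepsilon_{\mathrm{compl}}.

A point satisfying such a test is an approximate KKT point regardless of whether a constraint qualification holds at the limit, which is why sequential conditions are natural tools for justifying stopping criteria.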