On Second Order Optimality Conditions in Nonlinear Optimization

In this work we present new weak conditions that ensure the validity of necessary second-order optimality conditions (SOC) for nonlinear optimization. We prove that weak and strong SOCs hold for all Lagrange multipliers under Abadie-type assumptions. We also prove weak and strong SOCs for at least one Lagrange multiplier by imposing the …
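
For reference, one common way to state these conditions at a KKT point \(x^*\) of \(\min f(x)\) subject to \(h(x)=0,\ g(x)\le 0\), with Lagrangian \(L(x,\lambda,\mu)=f(x)+\lambda^\top h(x)+\mu^\top g(x)\), is the following sketch (the paper's exact formulations and its Abadie-type assumptions are not reproduced here):
\[
d^{\top}\,\nabla^2_{xx} L(x^*,\lambda,\mu)\,d \;\ge\; 0 \qquad \text{for all } d \in C,
\]
where the weak SOC takes \(C=\{d:\nabla h_i(x^*)^\top d=0\ \forall i,\ \nabla g_j(x^*)^\top d=0\ \text{for all active } j\}\), while the strong SOC takes \(C\) to be the larger critical cone \(\{d:\nabla h_i(x^*)^\top d=0\ \forall i,\ \nabla g_j(x^*)^\top d\le 0\ \text{for active } j,\ \nabla f(x^*)^\top d\le 0\}\).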

A Flexible Inexact Restoration Method and Application to Optimization with Multiobjective Constraints under Weighted-Sum Scalarization

We introduce a new flexible Inexact-Restoration (IR) algorithm and an application to problems with multiobjective constraints (MOCP) under the weighted-sum scalarization approach. In IR methods, each iteration has two phases: in the first phase one aims to improve feasibility, and in the second phase one minimizes a suitable objective function. This is done in …
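
To make the two-phase structure concrete, here is a schematic Python sketch of a generic IR iteration for an equality-constrained problem \(\min f(x)\) s.t. \(c(x)=0\); the Gauss-Newton restoration step, the tangential step, and the merit-function test are illustrative assumptions, not the algorithm proposed in the paper:

```python
import numpy as np

def ir_step(x, f, grad_f, c, jac_c, sigma=1e-2):
    """One schematic two-phase Inexact-Restoration step for
       min f(x)  s.t.  c(x) = 0   (illustrative only, not the paper's method)."""
    # Phase 1 (restoration): improve feasibility with a Gauss-Newton step on c.
    J = jac_c(x)
    d_feas = np.linalg.lstsq(J, c(x), rcond=None)[0]
    y = x - d_feas                                   # "more feasible" point

    # Phase 2 (optimality): decrease f along the tangent space of the
    # linearized constraints at y, i.e. directions d with jac_c(y) @ d = 0.
    Jy = jac_c(y)
    Q, _ = np.linalg.qr(Jy.T, mode='complete')
    Z = Q[:, Jy.shape[0]:]                           # null-space basis (full row rank assumed)
    d_tan = -Z @ (Z.T @ grad_f(y))

    # Accept the longest step that does not worsen a simple penalty-type
    # merit function (a crude stand-in for the flexible IR acceptance test).
    merit = lambda z: f(z) + np.linalg.norm(c(z)) / sigma
    t = 1.0
    while merit(y + t * d_tan) > merit(y) and t > 1e-8:
        t *= 0.5
    return y + t * d_tan if merit(y + t * d_tan) <= merit(y) else y
```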

Approximate-KKT stopping criterion when Lagrange multipliers are not available

In this paper we investigate how to efficiently apply Approximate-Karush-Kuhn-Tucker (AKKT) proximity measures as stopping criteria for optimization algorithms that do not generate approximations to Lagrange multipliers, in particular Genetic Algorithms. We prove that for a wide range of constrained optimization problems the KKT error measure tends to zero. We also develop a simple model …
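
As an illustration of the idea (not the paper's procedure), one can recover approximate multipliers at a candidate point by a nonnegative least-squares fit and report a KKT-type error; in the Python sketch below, `grad_f`, `g`, and `jac_g` are hypothetical callables for an inequality-constrained problem \(\min f(x)\) s.t. \(g(x)\le 0\):

```python
import numpy as np
from scipy.optimize import nnls

def kkt_error(x, grad_f, g, jac_g, tol_act=1e-6):
    """Estimate multipliers by nonnegative least squares and return a KKT residual."""
    gx, Jg = g(x), jac_g(x)
    active = gx >= -tol_act                  # nearly active inequalities
    A = Jg[active].T                         # columns: gradients of active g_i
    if A.size:
        mu_act, _ = nnls(A, -grad_f(x))      # best nonnegative multipliers
    else:
        mu_act = np.zeros(0)
    mu = np.zeros(len(gx))
    mu[active] = mu_act
    stationarity = np.linalg.norm(grad_f(x) + Jg.T @ mu, np.inf)
    feasibility = np.linalg.norm(np.maximum(gx, 0.0), np.inf)
    complementarity = np.linalg.norm(mu * gx, np.inf)
    return max(stationarity, feasibility, complementarity)
```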

Primal-dual relationship between Levenberg-Marquardt and central trajectories for linearly constrained convex optimization

We consider the minimization of a convex function on a compact polyhedron defined by linear equality constraints and nonnegative variables. We define the Levenberg-Marquardt (L-M) and central trajectories, both starting at the analytic center and using the same parameter, and show that they satisfy a primal-dual relationship, remaining close to each other for large values of …
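
A hedged sketch of the two trajectories for \(\min f(x)\) s.t. \(Ax=b,\ x\ge 0\), with \(x^0\) denoting the analytic center (the exact parametrization used in the paper may differ):
\[
x_{\mathrm C}(\mu)=\arg\min_{Ax=b,\;x>0}\Big\{f(x)-\mu\sum_i\ln x_i\Big\},
\qquad
x_{\mathrm{LM}}(\mu)=\arg\min_{Ax=b,\;x\ge 0}\Big\{f(x)+\tfrac{\mu}{2}\,\|x-x^0\|^2\Big\},
\]
both tending to \(x^0\) as \(\mu\to\infty\) and to a solution of the problem as \(\mu\to 0\) under this parametrization.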

Two new weak constraint qualifications and applications

We present two new constraint qualifications (CQ) that are weaker than the recently introduced Relaxed Constant Positive Linear Dependence (RCPLD) constraint qualification. RCPLD is based on the assumption that many subsets of the gradients of the active constraints preserve positive linear dependence locally. A major open question was to identify the exact set …
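
For reference, a family of gradients \(\{\nabla h_i(x)\}_{i\in I}\cup\{\nabla g_j(x)\}_{j\in J}\), with the \(g_j\), \(j\in J\), active, is said to be positively linearly dependent at \(x\) when there exist multipliers, not all zero, with
\[
\sum_{i\in I}\lambda_i\,\nabla h_i(x)+\sum_{j\in J}\mu_j\,\nabla g_j(x)=0,
\qquad \mu_j\ge 0\ \ \forall j\in J;
\]
the two new constraint qualifications themselves are not reproduced here.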

A relaxed constant positive linear dependence constraint qualification and applications

In this work we introduce a relaxed version of the constant positive linear dependence constraint qualification (CPLD) that we call RCPLD. This development is inspired by a recent generalization of the constant rank constraint qualification due to Minchenko and Stakhovski, called RCR. We show that RCPLD is enough to ensure the convergence of an …
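
As background, a sketch of the standard CPLD statement (see the paper for the precise RCPLD definition):
\[
\begin{aligned}
&\text{CPLD at a feasible } x^*\!: \text{ for all } I\subseteq\{1,\dots,m\},\ J\subseteq A(x^*),\\
&\quad \{\nabla h_i(x^*)\}_{i\in I}\cup\{\nabla g_j(x^*)\}_{j\in J}\ \text{positively linearly dependent}\\
&\quad \Longrightarrow\ \{\nabla h_i(x)\}_{i\in I}\cup\{\nabla g_j(x)\}_{j\in J}\ \text{linearly dependent for all } x \text{ near } x^*,
\end{aligned}
\]
where \(A(x^*)\) collects the active inequality indices; roughly speaking, RCPLD imposes this only for subsets built on a fixed basis of the equality-constraint gradients, together with a local constant-rank assumption on those gradients.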

On approximate KKT condition and its extension to continuous variational inequalities

In this work we introduce a natural necessary sequential Approximate-Karush-Kuhn-Tucker (AKKT) condition for a point to be a solution of a continuous variational inequality problem without constraint qualifications, and we prove its relation with the Approximate Gradient Projection condition (AGP) of Garciga-Otero and Svaiter. We also prove that a slight variation of the AKKT condition …
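
For the variational inequality of finding \(x^*\in\Omega=\{x: h(x)=0,\ g(x)\le 0\}\) with \(F(x^*)^\top(x-x^*)\ge 0\) for all \(x\in\Omega\), one common way to write an AKKT-type condition (a sketch adapted from the nonlinear programming case; the paper's exact condition may differ) asks for sequences \(x^k\to x^*\), \(\lambda^k\), \(\mu^k\ge 0\) with
\[
\Big\|F(x^k)+\sum_i\lambda_i^k\,\nabla h_i(x^k)+\sum_j\mu_j^k\,\nabla g_j(x^k)\Big\|\to 0,
\qquad
\min\{-g_j(x^k),\,\mu_j^k\}\to 0\ \ \text{for all } j.
\]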

On the global convergence of interior-point nonlinear programming algorithms

Carathéodory’s lemma states that if a vector is given as a linear combination of vectors in R^n, it can be rewritten as a combination of a linearly independent subset of those vectors. This result has been successfully applied in nonlinear optimization in many contexts. In this work we present a new version of this celebrated theorem, in which we obtain new bounds for …
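
One classical form of the lemma (the paper's new version, with its bounds, is not reproduced here): if \(0\neq v=\sum_{i=1}^m\alpha_i u_i\) with \(\alpha_i\ge 0\), then there is a subset \(I\subseteq\{1,\dots,m\}\) such that \(\{u_i\}_{i\in I}\) is linearly independent and
\[
v=\sum_{i\in I}\bar\alpha_i\,u_i \qquad \text{with } \bar\alpha_i>0 \ \text{ for all } i\in I.
\]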

On sequential optimality conditions for smooth constrained optimization

Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Approximate KKT and Approximate Gradient Projection conditions are analyzed in this work. These conditions are not necessarily equivalent; implications between the different conditions and counter-examples are presented, and algorithmic consequences are discussed.
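
As an illustration of how such conditions translate into a stopping test when a solver does return multiplier estimates, here is a minimal Python sketch for \(\min f(x)\) s.t. \(h(x)=0,\ g(x)\le 0\); the tolerances and the specific residuals are assumptions, not taken from the paper:

```python
import numpy as np

def akkt_stop(grad_f_x, Jh_x, Jg_x, h_x, g_x, lam, mu, eps=1e-6):
    """Return True if (x, lam, mu) passes an approximate-KKT stopping test."""
    stat = grad_f_x + Jh_x.T @ lam + Jg_x.T @ mu          # Lagrangian gradient
    feas = max(np.linalg.norm(h_x, np.inf),
               np.linalg.norm(np.maximum(g_x, 0.0), np.inf))
    compl = np.linalg.norm(np.minimum(-g_x, mu), np.inf)  # approximate complementarity
    return (np.linalg.norm(stat, np.inf) <= eps
            and feas <= eps
            and np.all(mu >= -eps)
            and compl <= eps)
```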