A New Sequential Optimality Condition for Constrained Nonsmooth Optimization

We introduce a sequential optimality condition for constrained nonsmooth optimization with locally Lipschitz data that is verifiable using derivative information alone and that holds even in the absence of any constraint qualification. The proposed sequential optimality condition is not only novel for nonsmooth problems but also brings new insights for the smooth case. We present a practical algorithm …
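
For orientation (background, not the paper's condition): in the smooth setting with constraints $g_i(x) \le 0$ and $h_j(x) = 0$, the best-known sequential optimality condition is AKKT, which asks for sequences $x^k \to x^*$ and multipliers $\lambda^k \ge 0$, $\mu^k$ such that
\[
\Big\| \nabla f(x^k) + \sum_i \lambda_i^k \nabla g_i(x^k) + \sum_j \mu_j^k \nabla h_j(x^k) \Big\| \to 0
\quad\text{and}\quad
\min\{-g_i(x^k),\, \lambda_i^k\} \to 0 \ \text{for all } i.
\]
Like the condition proposed here, AKKT holds at local minimizers without any constraint qualification; whether the paper's condition reduces to AKKT in the smooth case is not stated in this excerpt.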

Local convergence analysis of the Levenberg-Marquardt framework for nonzero-residue nonlinear least-squares problems under an error bound condition

The Levenberg-Marquardt method (LM) is widely used for solving nonlinear systems of equations, as well as nonlinear least-squares problems. In this paper, we consider local convergence issues of the LM method when applied to nonzero-residue nonlinear least-squares problems under an error bound condition, which is weaker than requiring full rank of the Jacobian in a …
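
As a rough illustration of the framework analyzed here (a generic LM iteration, not the authors' exact method), each step for $\min \frac{1}{2}\|F(x)\|^2$ solves a damped normal system; the toy data, damping choice, and fixed iteration count below are illustrative assumptions.

```python
import numpy as np

def lm_step(F, J, x, mu):
    """One Levenberg-Marquardt step for min 0.5*||F(x)||^2:
    solve (J^T J + mu*I) d = -J^T F and move to x + d."""
    Jx, Fx = J(x), F(x)
    d = np.linalg.solve(Jx.T @ Jx + mu * np.eye(x.size), -Jx.T @ Fx)
    return x + d

# Toy nonzero-residue problem: fit y ~ a*exp(b*t); the constant offset
# in the data guarantees a nonzero residual at the solution.
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(0.5 * t) + 0.1
F = lambda x: x[0] * np.exp(x[1] * t) - y
J = lambda x: np.column_stack([np.exp(x[1] * t),
                               x[0] * t * np.exp(x[1] * t)])

x = np.array([1.0, 0.0])
for _ in range(50):
    x = lm_step(F, J, x, mu=np.linalg.norm(F(x)))  # damping ~ residual norm (one common choice)
print(x, 0.5 * np.linalg.norm(F(x)) ** 2)
```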

Accelerating block coordinate descent methods with identification strategies

This work concerns active-set identification strategies aimed at accelerating block coordinate descent methods (BCDM) applied to large-scale problems. We begin by devising an identification function tailored for bound-constrained composite minimization, together with an associated version of the BCDM, called Active BCDM, that is also globally convergent. The identification function gives rise to an efficient …
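
A minimal sketch (not the paper's Active BCDM) of cyclic block coordinate projected gradient descent for a bound-constrained quadratic, with a naive identification rule that freezes variables sitting at a bound whose gradient pushes them further outside the box; all problem data below are made up.

```python
import numpy as np

def bcd_bound_constrained(A, b, lo, hi, blocks, iters=200):
    """Cyclic block coordinate projected gradient for
    min 0.5 x^T A x - b^T x  s.t.  lo <= x <= hi  (A symmetric PD)."""
    x = np.clip(np.zeros_like(b), lo, hi)
    for _ in range(iters):
        for blk in blocks:
            g = A[blk] @ x - b[blk]                      # block gradient
            L = np.linalg.norm(A[np.ix_(blk, blk)], 2)   # block Lipschitz constant
            # naive identification: skip coordinates already at a bound
            # whose gradient points further outward
            fixed = ((x[blk] <= lo[blk]) & (g > 0)) | ((x[blk] >= hi[blk]) & (g < 0))
            step = np.where(fixed, 0.0, -g / L)
            x[blk] = np.clip(x[blk] + step, lo[blk], hi[blk])
    return x

# toy usage: unconstrained minimizer is b/2 = 1.5, so the box clips it to 1
n = 6
A, b = 2.0 * np.eye(n), 3.0 * np.ones(n)
x = bcd_bound_constrained(A, b, np.zeros(n), np.ones(n),
                          blocks=[np.arange(0, 3), np.arange(3, 6)])
print(x)  # ~ all ones
```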

On the use of third-order models with fourth-order regularization for unconstrained optimization

In a recent paper, it was shown that, for the smooth unconstrained optimization problem, worst-case evaluation complexity $O(\epsilon^{-(p+1)/p})$ may be obtained by means of algorithms that employ sequential approximate minimizations of $p$-th order Taylor models plus $(p+1)$-th order regularization terms. The aforementioned result, which assumes Lipschitz continuity of the $p$-th partial derivatives, generalizes …
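
Concretely (notation assumed from the standard high-order regularization literature, not quoted from the paper), each iteration of such a scheme approximately minimizes the regularized Taylor model
\[
m_k(s) \;=\; f(x_k) \;+\; \sum_{j=1}^{p} \frac{1}{j!}\, \nabla^j f(x_k)[s]^j \;+\; \frac{\sigma_k}{p+1}\,\|s\|^{p+1},
\]
which for $p = 3$ gives the third-order model with fourth-order regularization of the title.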

A pattern search and implicit filtering algorithm for solving linearly constrained minimization problems with noisy objective functions

PSIFA (Pattern Search and Implicit Filtering Algorithm) is a derivative-free algorithm designed for linearly constrained problems with noise in the objective function. It combines elements of the pattern search approach of Lewis and Torczon (2000) with ideas from the implicit filtering method of Kelley (2011), enhanced with a further analysis …
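
To fix ideas (a generic coordinate pattern search, not PSIFA itself, and unconstrained for brevity, whereas PSIFA additionally handles linear constraints): the basic loop polls a stencil of step directions and shrinks the step when no poll point improves on the incumbent. The noisy objective below is a placeholder.

```python
import numpy as np

def pattern_search(f, x, step=1.0, tol=1e-6, max_iter=1000):
    """Generic coordinate pattern search: poll +/- step along each
    coordinate axis; shrink the step when no poll point improves."""
    n, fx = x.size, f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for d in np.vstack([np.eye(n), -np.eye(n)]):
            y = x + step * d
            fy = f(y)
            if fy < fx:
                x, fx, improved = y, fy, True
                break
        if not improved:
            step *= 0.5  # unsuccessful poll: refine the mesh
    return x, fx

rng = np.random.default_rng(0)
noisy = lambda x: np.sum((x - 1.0) ** 2) + 1e-6 * rng.standard_normal()
print(pattern_search(noisy, np.zeros(3)))
```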

Bilevel optimization with a multiobjective problem in the lower level

Bilevel problems model instances with a hierarchical structure. Aiming at an efficient solution of a constrained multiobjective problem, selected according to some pre-defined criterion, we reformulate this nonstandard optimization problem as a classic bilevel one. The reformulation is intended to encompass all the objectives, so that the properly efficient solution set is recovered by means …
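
One classical route to such a reformulation (a sketch only; the paper's exact construction is not shown in this excerpt) scalarizes the lower level with positive weights, which under convexity characterize the properly efficient points (Geoffrion):
\[
\min_{w,\,x} \ \Phi(x)
\quad \text{s.t.} \quad
w > 0,\ \ \sum_{i=1}^{m} w_i = 1,\ \
x \in \operatorname*{arg\,min}_{y \in X} \ \sum_{i=1}^{m} w_i f_i(y),
\]
where $\Phi$ is the pre-defined selection criterion, $f_1, \dots, f_m$ are the lower-level objectives, and $X$ is the feasible set.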

On the local convergence analysis of the Gradient Sampling method

The Gradient Sampling method is a recently developed tool for solving unconstrained nonsmooth optimization problems. Using only first-order information about the objective function, it generalizes the steepest descent method, one of the most classical methods for minimizing a smooth function. This manuscript aims at determining under which circumstances one can expect the same local …
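
For reference, the defining step of Gradient Sampling (Burke, Lewis, and Overton) takes the search direction as the minimum-norm element of the convex hull of gradients sampled at nearby points of differentiability:
\[
d_k \;=\; -\operatorname*{arg\,min}\big\{\, \|g\|^2 \,:\, g \in \operatorname{conv}\{\nabla f(x_{k,1}), \dots, \nabla f(x_{k,m})\} \,\big\},
\qquad x_{k,i} \in B(x_k, \varepsilon_k),
\]
followed by a line search along $d_k$ and a gradual reduction of the sampling radius $\varepsilon_k$.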

A Second-Order Information-Based Gradient and Function Sampling Method for Nonconvex, Nonsmooth Optimization

This paper proposes a gradient and function sampling method that, under special circumstances, moves superlinearly to a minimizer of a general class of nonsmooth, nonconvex functions. We present global and local convergence theory with illustrative examples that corroborate and elucidate the theoretical results obtained throughout the manuscript.

Evaluation complexity for nonlinear constrained optimization using unscaled KKT conditions and high-order models

The evaluation complexity of general nonlinear, possibly nonconvex, constrained optimization is analyzed. It is shown that, under suitable smoothness conditions, an $\epsilon$-approximate first-order critical point of the problem can be computed in order $O(\epsilon^{1-2(p+1)/p})$ evaluations of the problem's functions and their first $p$ derivatives. This is achieved by using a two-phase algorithm inspired by Cartis, Gould, …
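
For concreteness (a typical formulation, with notation assumed rather than quoted from the paper): for $\min f(x)$ subject to $c(x) = 0$, an unscaled $\epsilon$-approximate first-order critical point is a pair $(x, \lambda)$ with
\[
\|c(x)\| \le \epsilon
\quad\text{and}\quad
\big\| \nabla f(x) + J_c(x)^T \lambda \big\| \le \epsilon,
\]
with no rescaling of the stationarity residual by the size of the multiplier $\lambda$.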

Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models

The worst-case evaluation complexity for smooth (possibly nonconvex) unconstrained optimization is considered. It is shown that, if one is willing to use derivatives of the objective function up to order $p$ (for $p\geq 1$) and to assume Lipschitz continuity of the $p$-th derivative, then an $\epsilon$-approximate first-order critical point can be computed in at most …