Two new weak constraint qualifications and applications

We present two new constraint qualifications (CQs) that are weaker than the recently introduced Relaxed Constant Positive Linear Dependence (RCPLD) constraint qualification. RCPLD is based on the assumption that many subsets of the gradients of the active constraints preserve positive linear dependence locally. A major open question was to identify the exact set …
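
For context, positive linear dependence is the standard notion underlying CPLD-type conditions (recalled here for orientation; this is not necessarily the paper's exact wording): the gradients of the equality constraints and of the active inequality constraints are positively linearly dependent at $x^*$ if
\[
\sum_{i \in E} \lambda_i \nabla h_i(x^*) \;+\; \sum_{j \in A(x^*)} \mu_j \nabla g_j(x^*) = 0
\]
for some multipliers with $\mu_j \ge 0$ and $(\lambda,\mu) \neq 0$. CPLD-type constraint qualifications then ask that any such subset of gradients that is positively linearly dependent at $x^*$ remains linearly dependent at every point in some neighbourhood of $x^*$.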

Optimal Newton-type methods for nonconvex smooth optimization problems

We consider a general class of second-order iterations for unconstrained optimization that includes regularization and trust-region variants of Newton’s method. For each method in this class, we exhibit a smooth, bounded-below objective function, whose gradient is globally Lipschitz continuous within an open convex set containing any iterates encountered and whose Hessian is $\alpha$-Hölder continuous (for …
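
As an illustration of a generic member of this class (under the stated $\alpha$-Hölder assumption; this is not the paper's specific construction), a regularized Newton iteration computes each step by approximately minimizing a second-order model with a $(2+\alpha)$-power regularizer,
\[
s_k \in \arg\min_{s}\; f(x_k) + \nabla f(x_k)^{T} s + \tfrac{1}{2}\, s^{T} \nabla^2 f(x_k)\, s + \tfrac{\sigma_k}{2+\alpha}\, \|s\|^{2+\alpha},
\]
with $\alpha = 1$ recovering cubic regularization, while trust-region variants minimize the same model without the regularizer subject to $\|s\| \le \Delta_k$.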

Global Convergence of Radial Basis Function Trust Region Derivative-Free Algorithms

We analyze globally convergent derivative-free trust region algorithms relying on radial basis function interpolation models. Our results extend the recent work of Conn, Scheinberg, and Vicente to fully linear models that have a nonlinear term. We characterize the types of radial basis functions that fit in our analysis and thus show global convergence to first-order …
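
A typical model of the kind covered by this analysis (shown schematically; the notation is generic) interpolates the objective on a sample set $\{y^1,\dots,y^p\}$ with a radial nonlinear term plus a linear polynomial tail,
\[
m(x) = \sum_{i=1}^{p} \lambda_i\, \phi\!\bigl(\|x - y^i\|\bigr) + c + g^{T} x, \qquad m(y^i) = f(y^i), \quad i = 1,\dots,p,
\]
where $\phi$ is a suitable (conditionally positive definite) radial basis function such as the cubic or multiquadric; the linear tail is what allows such interpolation models to be made fully linear within a trust region.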

A Penalty-Interior-Point Algorithm for Nonlinear Constrained Optimization

Penalty and interior-point methods for nonlinear optimization problems have enjoyed great success for decades. Penalty methods have proved to be effective for a variety of problem classes due to their regularization effects on the constraints. They have also been shown to allow for rapid infeasibility detection. Interior-point methods have become the workhorse in large-scale optimization …
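
One generic way to blend the two ideas (sketched here only to fix the flavour of a penalty-interior-point scheme; it is not necessarily the formulation developed in the paper) is to relax inequality constraints $c(x) \ge 0$ with elastic slacks, penalize the slacks in an $\ell_1$ fashion, and handle the resulting nonnegativity with a logarithmic barrier:
\[
\min_{x,\,s}\;\; f(x) + \nu \sum_i s_i \;-\; \mu \sum_i \Bigl[\log\bigl(c_i(x) + s_i\bigr) + \log s_i\Bigr].
\]
Driving $\mu \to 0$ recovers an exact $\ell_1$ penalty reformulation, while the update of $\nu$ balances feasibility against optimality and supports infeasibility detection.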

Metal Artefact Reduction by Least-Squares Penalized-Likelihood Reconstruction with a Fast Polychromatic Projection Model

We consider penalized-likelihood reconstruction for X-ray computed tomography of objects that contain small metal structures. To reduce the beam hardening artefacts induced by these structures, we derive the reconstruction algorithm from a projection model that takes into account the photon emission spectrum and the nonlinear variation of attenuation with photon energy. This algorithm requires excessively long …
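
For orientation, the polychromatic projection model underlying such beam-hardening corrections is usually written, in discrete form and with generic notation not taken from the paper, as
\[
\bar{y}_i \;=\; \sum_{k} S_i(E_k)\, \exp\!\Bigl(-\sum_{j} a_{ij}\, \mu_j(E_k)\Bigr),
\]
where $S_i(E_k)$ is the effective source spectrum seen by detector bin $i$ at energy $E_k$, $a_{ij}$ the intersection length of ray $i$ with voxel $j$, and $\mu_j(E_k)$ the energy-dependent attenuation in voxel $j$. Evaluating this sum over energy bins at every iteration is what makes a straightforward implementation slow, which motivates a fast approximate projection model.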

Using approximate secant equations in limited memory methods for multilevel unconstrained optimization

The properties of multilevel optimization problems defined on a hierarchy of discretization grids can be used to define approximate secant equations, which describe the second-order behaviour of the objective function. Following earlier work by Gratton and Toint (2009), we introduce a quasi-Newton method (with a linesearch) and a nonlinear conjugate gradient method that both take …
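
Recall that a quasi-Newton secant equation asks the updated Hessian approximation to reproduce an observed gradient difference,
\[
B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k);
\]
in the multilevel setting, approximate pairs can instead be assembled from quantities computed on a coarser grid and transferred to the fine level through a prolongation operator (written here generically as $s \approx P s^{c}$, $y \approx P y^{c}$; the operator $P$ and this particular construction are schematic, not the paper's exact definitions).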

Nonmonotone Filter Method for Nonlinear Optimization

We propose a new nonmonotone filter method to promote global and fast local convergence for sequential quadratic programming algorithms. Our method uses two filters: a global g-filter for global convergence, and a local nonmonotone l-filter that allows us to establish fast local convergence. We show how to switch between the two filters efficiently, and we …
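
For reference, a classical (monotone) filter accepts a trial point $x^{+}$ when it sufficiently reduces either the constraint violation $h$ or the objective $f$ with respect to every entry stored in the filter,
\[
h(x^{+}) \le (1-\gamma_h)\, h_j \qquad \text{or} \qquad f(x^{+}) \le f_j - \gamma_f\, h_j \qquad \text{for all } (h_j, f_j) \text{ in the filter};
\]
the g-/l-filter mechanism of the paper builds on tests of this type but relaxes the monotonicity requirement, so this display is only the standard point of departure.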

Stopping rules and backward error analysis for bound-constrained optimization

Termination criteria for the iterative solution of bound-constrained optimization problems are examined in the light of backward error analysis. It is shown that the problem of determining a suitable perturbation of the problem’s data corresponding to the definition of the backward error is analytically solvable under mild assumptions. Moreover, a link between existing termination criteria …
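
A common criterion of the type examined here, written for $\min\{\, f(x) : l \le x \le u \,\}$ with generic notation, stops when the projected gradient is small,
\[
\bigl\| x_k - P_{[l,u]}\bigl(x_k - \nabla f(x_k)\bigr) \bigr\| \;\le\; \epsilon,
\]
where $P_{[l,u]}$ denotes the componentwise projection onto the box. In the backward-error spirit, such a test can be read as certifying that $x_k$ is exactly first-order critical for a nearby problem whose gradient data has been perturbed by an amount of roughly the size of the left-hand side.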

Algorithm 909: NOMAD: Nonlinear Optimization with the MADS algorithm

NOMAD is software that implements the MADS algorithm (Mesh Adaptive Direct Search) for blackbox optimization under general nonlinear constraints. Blackbox optimization means optimizing functions that are typically given as costly programs, provide no derivative information, and may fail to return a function value for a significant fraction of the calls attempted. NOMAD is designed for such problems and …
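
At its core, each MADS iteration polls the blackbox at mesh points around the incumbent $x_k$ (a schematic description of the basic poll step only, not of NOMAD's full feature set),
\[
P_k = \bigl\{\, x_k + \Delta_k^{m} d \;:\; d \in D_k \,\bigr\},
\]
where $D_k$ is a positive spanning set of directions and $\Delta_k^{m}$ the mesh size parameter; successful iterations may keep or enlarge the mesh, unsuccessful ones shrink it, and the normalized poll directions become asymptotically dense in the unit sphere as $\Delta_k^{m} \to 0$, which is what distinguishes MADS from simpler pattern searches.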

Real-Time Optimization as a Generalized Equation

We establish results for the problem of tracking a time-dependent manifold arising in online nonlinear programming by casting it as a generalized equation. We demonstrate that if points along a solution manifold are consistently strongly regular, it is possible to track the manifold approximately by solving a linear complementarity problem (LCP) at each time step. …
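
In this framework the parametric optimality system is written as a generalized equation (generic notation, used only to illustrate the setting),
\[
0 \in F(z, t) + N_{K}(z),
\]
where $z$ collects the primal and dual variables, $K$ is a polyhedral set such as the nonnegative orthant for the multipliers, and $N_K(z)$ is its normal cone. Strong regularity in Robinson's sense guarantees that the linearized inclusion $0 \in F(z_k, t_{k+1}) + \nabla_z F(z_k, t_{k+1})(z - z_k) + N_K(z)$, which amounts to a linear complementarity problem, has a locally unique solution that follows the true solution manifold to first order.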