Bilevel Derivative-Free Optimization and its Application to Robust Optimization

We address bilevel programming problems when the derivatives of both the upper and the lower level objective functions are unavailable. The core algorithms used for both levels are trust-region interpolation-based methods, using minimum Frobenius norm quadratic models when the number of points is smaller than the number of basis components. We take advantage of the …
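
As a rough illustration of the modelling ingredient mentioned above, the following numpy sketch builds a minimum Frobenius norm quadratic model from an interpolation set, assuming at least n+1 poised points; the function name, the basis scaling and the dense solver are illustrative choices, not taken from the paper.

import numpy as np
from itertools import combinations_with_replacement

def mfn_quadratic_model(Y, fvals):
    """Minimum Frobenius norm quadratic interpolation model.

    Y     : (p, n) interpolation points (p may be smaller than (n+1)(n+2)/2)
    fvals : (p,) function values at the rows of Y
    Returns (c, g, H) with m(x) = c + g @ x + 0.5 * x @ H @ x interpolating
    fvals on Y while minimizing ||H||_F among all interpolating quadratics.
    """
    p, n = Y.shape
    ML = np.hstack([np.ones((p, 1)), Y])              # linear basis [1, x]
    # Quadratic monomials x_i * x_j (i <= j), scaled so ||alpha_Q|| = ||H||_F.
    pairs = list(combinations_with_replacement(range(n), 2))
    MQ = np.empty((p, len(pairs)))
    for k, (i, j) in enumerate(pairs):
        scale = 0.5 if i == j else 1.0 / np.sqrt(2.0)
        MQ[:, k] = scale * Y[:, i] * Y[:, j]
    # KKT system of:  minimize 0.5*||alpha_Q||^2  s.t.  MQ alpha_Q + ML alpha_L = f
    A = np.block([[MQ @ MQ.T, ML],
                  [ML.T, np.zeros((n + 1, n + 1))]])
    rhs = np.concatenate([fvals, np.zeros(n + 1)])
    sol = np.linalg.solve(A, rhs)
    mu, alpha_L = sol[:p], sol[p:]
    alpha_Q = MQ.T @ mu
    # Unpack alpha_Q into the symmetric model Hessian H.
    H = np.zeros((n, n))
    for k, (i, j) in enumerate(pairs):
        if i == j:
            H[i, i] = alpha_Q[k]
        else:
            H[i, j] = H[j, i] = alpha_Q[k] / np.sqrt(2.0)
    return alpha_L[0], alpha_L[1:], H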

On affine scaling inexact dogleg methods for bound-constrained nonlinear systems

A class of trust-region methods for large-scale bound-constrained systems of nonlinear equations is presented. The methods in this class follow the so-called affine-scaling approach and can efficiently handle large-scale problems. At each iteration, a suitably scaled region around the current approximate solution is defined and, within such a region, the norm of …
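
For readers unfamiliar with the affine-scaling idea, the sketch below shows one common Coleman-Li-style choice of the diagonal scaling built from the bounds and the gradient of the merit function; the function name, the sign conventions and the exponent -1/2 are illustrative assumptions, not necessarily the exact definitions used in the paper.

import numpy as np

def coleman_li_scaling(x, grad, lower, upper):
    """Coleman-Li style diagonal scaling for bounds lower <= x <= upper.

    grad is the gradient of the merit function f(x) = 0.5*||F(x)||^2.
    |v_i| measures the distance from x_i to the bound the anti-gradient
    points toward; D = diag(|v|**-0.5) then defines the scaled trust
    region ||D s|| <= Delta (other papers use the inverse convention).
    """
    v = np.empty_like(x)
    for i in range(x.size):
        if grad[i] < 0:
            v[i] = x[i] - upper[i] if np.isfinite(upper[i]) else -1.0
        else:
            v[i] = x[i] - lower[i] if np.isfinite(lower[i]) else 1.0
    return v

# Example: strictly feasible point, two-sided bounds on the first variable.
x = np.array([0.5, 2.0])
g = np.array([-1.0, 3.0])          # merit-function gradient at x
l = np.array([0.0, -np.inf])
u = np.array([1.0, np.inf])
v = coleman_li_scaling(x, g, l, u)
D = np.diag(np.abs(v) ** -0.5)     # scaling matrix defining the elliptical region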

Constrained Dogleg Methods for nonlinear systems with simple bounds

We focus on the numerical solution of medium-scale bound-constrained systems of nonlinear equations. In this context, we consider an affine-scaling trust-region approach that allows great flexibility in choosing the scaling matrix used to handle the bounds. The method is based on a dogleg procedure tailored for constrained problems and so it is …
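
As a rough sketch of the underlying machinery (not the paper's tailored constrained dogleg), the following numpy code computes a classical dogleg step for the local Gauss-Newton model and then shortens it so the iterate stays strictly inside the bounds; all names and the fraction-to-the-boundary parameter theta are illustrative.

import numpy as np

def dogleg_step(J, F, delta):
    """Classical dogleg step for min 0.5*||F + J s||^2 subject to ||s|| <= delta."""
    g = J.T @ F                                   # gradient of the merit function
    sN = np.linalg.solve(J, -F)                   # Newton step (assumes J nonsingular)
    if np.linalg.norm(sN) <= delta:
        return sN
    t = (g @ g) / np.linalg.norm(J @ g) ** 2      # minimizer of the model along -g
    sC = -t * g                                   # Cauchy step
    if np.linalg.norm(sC) >= delta:
        return -delta * g / np.linalg.norm(g)
    # Intersection of the segment sC + tau*(sN - sC) with the sphere ||s|| = delta.
    d = sN - sC
    a, b, c = d @ d, 2 * sC @ d, sC @ sC - delta ** 2
    tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return sC + tau * d

def clip_to_bounds(x, s, lower, upper, theta=0.995):
    """Shorten the step so that x + alpha*s stays strictly inside the bounds."""
    alpha = 1.0
    for i in range(x.size):
        if s[i] > 0 and np.isfinite(upper[i]):
            alpha = min(alpha, theta * (upper[i] - x[i]) / s[i])
        elif s[i] < 0 and np.isfinite(lower[i]):
            alpha = min(alpha, theta * (lower[i] - x[i]) / s[i])
    return alpha * s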

A globally convergent modified conjugate-gradient line-search algorithm with inertia controlling

In this paper we address the problem of unboundedness in the search direction when the Hessian is indefinite or nearly singular. We propose a new algorithm that naturally handles singular Hessian matrices and is theoretically equivalent to the trust-region approach. This is accomplished by adaptively performing explicit matrix modifications that mimic the implicit …
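
To make the curvature issue concrete, here is a minimal conjugate-gradient sketch that monitors p^T H p and stops when non-positive (or tiny) curvature is detected, which is the situation the modifications above are designed to handle; this is a generic illustration, not the proposed algorithm.

import numpy as np

def cg_with_curvature_test(H, g, tol=1e-8, max_iter=None):
    """Conjugate gradients on H d = -g with an explicit curvature test.

    If a search direction p with p^T H p <= tol*||p||^2 is encountered
    (indefinite or near-singular Hessian), iteration stops and that
    direction is returned so the outer line search can exploit it.
    """
    n = g.size
    max_iter = max_iter or 2 * n
    d = np.zeros(n)
    r = -g.copy()                  # residual of H d = -g at d = 0
    p = r.copy()
    for _ in range(max_iter):
        Hp = H @ p
        curv = p @ Hp
        if curv <= tol * (p @ p):  # non-positive (or tiny) curvature detected
            return d, p            # partial step plus a descent/curvature direction
        alpha = (r @ r) / curv
        d += alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) <= tol * np.linalg.norm(g):
            return d, None
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return d, None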

A Derivative-Free Algorithm for Least-Squares Minimization

We develop a framework for a class of derivative-free algorithms for the least-squares minimization problem. These algorithms are based on polynomial interpolation models and are designed to take advantage of the problem structure. Under suitable conditions, we establish the global convergence and local quadratic convergence properties of these algorithms. Promising numerical results indicate the algorithm …
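
A minimal illustration of the structure exploitation: interpolate each residual separately and assemble a Gauss-Newton-type step from the resulting model Jacobian. The sketch assumes n+1 affinely independent sample points; the function names and the optional trust-region truncation are illustrative assumptions, not the paper's algorithm.

import numpy as np

def linear_residual_models(Y, R):
    """Interpolate each residual r_i by a linear model around the base point Y[0].

    Y : (n+1, n) sample points (rows), assumed affinely independent.
    R : (n+1, m) residual values, R[k, i] = r_i(Y[k]).
    Returns c (residual values at Y[0]) and G whose column i is the model
    gradient of r_i.
    """
    S = Y[1:] - Y[0]                        # (n, n) displacement matrix
    c = R[0]
    G = np.linalg.solve(S, R[1:] - R[0])    # S @ g_i = r_i(Y[k]) - r_i(Y[0])
    return c, G

def structured_ls_step(c, G, delta=None):
    """Gauss-Newton-type step from the residual models: minimize ||c + J s||
    with model Jacobian J = G.T, optionally shortened to a trust-region radius."""
    J = G.T
    s, *_ = np.linalg.lstsq(J, -c, rcond=None)
    if delta is not None and np.linalg.norm(s) > delta:
        s *= delta / np.linalg.norm(s)
    return s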

A practical method for solving large-scale TRS

We present a nearly exact method for the large-scale trust-region subproblem (TRS) based on the properties of the minimal-memory BFGS method. Our study concentrates on the case where the initial BFGS matrix can be any scaled identity matrix. The proposed method is a variant of the Moré-Sorensen method that exploits the eigenstructure of …
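
For context, the classical Moré-Sorensen iteration that the proposed method builds on can be sketched in dense form as follows; safeguarding and the hard case are omitted, and this is not the limited-memory variant developed in the paper.

import numpy as np
from scipy.linalg import cho_factor, cho_solve

def more_sorensen(H, g, delta, tol=1e-6, max_iter=50):
    """Dense sketch of the Moré-Sorensen iteration for
        min  g^T p + 0.5 p^T H p   subject to  ||p|| <= delta.
    Newton's method is applied to the secular equation
        phi(lam) = 1/delta - 1/||p(lam)||,   (H + lam I) p(lam) = -g,
    with lam kept above -lambda_min(H) so a Cholesky factorization exists.
    """
    n = g.size
    lam = max(0.0, -np.linalg.eigvalsh(H)[0] + 1e-8)
    for _ in range(max_iter):
        c = cho_factor(H + lam * np.eye(n))
        p = cho_solve(c, -g)
        pnorm = np.linalg.norm(p)
        if abs(pnorm - delta) <= tol * delta or (lam == 0.0 and pnorm < delta):
            return p, lam
        w = cho_solve(c, p)
        qnorm2 = p @ w                  # = ||L^{-1} p||^2 with H + lam I = L L^T
        lam = max(0.0, lam + (pnorm ** 2 / qnorm2) * (pnorm - delta) / delta)
    return p, lam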

On solving trust-region and other regularised subproblems in optimization

The solution of trust-region and regularisation subproblems which arise in unconstrained optimization is considered. Building on the pioneering work of Gay, Moré and Sorensen, methods which obtain the solution of a sequence of parametrized linear systems by factorization are used. Enhancements using high-order polynomial approximation and inverse iteration ensure that the resulting method is both …
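
For concreteness, the parametrized linear systems referred to above take the following standard form (a textbook formulation in LaTeX notation, not quoted from the paper): the factorization-based solver repeatedly factorizes H + \lambda I and adjusts the scalar \lambda.

  (H + \lambda I)\,x(\lambda) = -g, \qquad \lambda \ge \max\{0,\,-\lambda_{\min}(H)\},

  \text{trust region: find } \lambda \text{ with } \|x(\lambda)\|_2 = \Delta
  \ \ (\text{or } \lambda = 0 \text{ and } \|x(0)\|_2 \le \Delta),

  \text{cubic regularisation: find } \lambda \text{ with } \lambda = \sigma\,\|x(\lambda)\|_2.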

An Interior-Point Algorithm for Large-Scale Nonlinear Optimization with Inexact Step Computations

We present a line-search algorithm for large-scale continuous optimization. The algorithm is matrix-free in that it does not require the factorization of derivative matrices. Instead, it uses iterative linear system solvers. Inexact step computations are supported in order to save computational expense during each iteration. The algorithm is an interior-point approach derived from an inexact …
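
A minimal matrix-free sketch of an inexact barrier-Newton step of this flavour, assuming a recent SciPy whose cg accepts an rtol stopping tolerance; the simple-bound barrier setup, the function names and the forcing term eta are illustrative assumptions, not the paper's interior-point algorithm.

import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def inexact_barrier_newton_step(x, grad_f, hessvec_f, mu, eta=0.1):
    """One matrix-free, inexact Newton step on the log-barrier subproblem
        min f(x) - mu * sum(log x_i),   x > 0.
    The primal Newton system (Hess f + mu X^{-2}) d = -(grad f - mu / x)
    is solved only approximately with CG, stopping once the relative
    residual drops below the forcing term eta; only Hessian-vector
    products are needed, never a factorization.
    """
    n = x.size
    rhs = -(grad_f(x) - mu / x)
    def matvec(v):
        return hessvec_f(x, v) + (mu / x ** 2) * v
    A = LinearOperator((n, n), matvec=matvec)
    d, info = cg(A, rhs, rtol=eta, maxiter=200)
    return d

# Tiny usage example on f(x) = 0.5*||x - 1||^2 with x > 0 (illustrative only).
grad_f = lambda x: x - 1.0
hessvec_f = lambda x, v: v
x = np.array([0.2, 3.0])
d = inexact_barrier_newton_step(x, grad_f, hessvec_f, mu=0.1)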

ORBIT: Optimization by Radial Basis Function Interpolation in Trust-Regions

We present a new derivative-free algorithm, ORBIT, for unconstrained local optimization of computationally expensive functions. A trust-region framework using interpolating Radial Basis Function (RBF) models is employed. The RBF models considered often allow ORBIT to interpolate nonlinear functions using fewer function evaluations than the polynomial models used by existing techniques. Approximation guarantees are obtained by …
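
As an illustration of the interpolation ingredient, the sketch below fits a cubic RBF model with a linear polynomial tail by solving the standard saddle-point system; the kernel choice and all names are assumptions for illustration, not necessarily the exact choices made in ORBIT.

import numpy as np

def cubic_rbf_model(Y, fvals):
    """Fit m(x) = sum_j lam_j * ||x - y_j||^3 + b^T x + a, an interpolating
    cubic RBF with a linear tail, by solving the usual saddle-point system.
    Requires the points to be poised for linear interpolation.
    """
    p, n = Y.shape
    D = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=2)
    Phi = D ** 3                                  # cubic radial kernel phi(r) = r^3
    P = np.hstack([Y, np.ones((p, 1))])           # linear tail basis [x, 1]
    A = np.block([[Phi, P],
                  [P.T, np.zeros((n + 1, n + 1))]])
    rhs = np.concatenate([fvals, np.zeros(n + 1)])
    sol = np.linalg.solve(A, rhs)
    lam, tail = sol[:p], sol[p:]
    def model(x):
        r = np.linalg.norm(x - Y, axis=1)
        return lam @ r ** 3 + tail[:n] @ x + tail[n]
    return model

# Usage: interpolate a nonlinear function at 6 points in R^2 (illustrative).
rng = np.random.default_rng(0)
Y = rng.standard_normal((6, 2))
f = np.array([np.sin(y[0]) + y[1] ** 2 for y in Y])
m = cubic_rbf_model(Y, f)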

A Matrix-free Algorithm for Equality Constrained Optimization Problems with Rank-deficient Jacobians

We present a line search algorithm for large-scale constrained optimization that is robust and efficient even for problems with (nearly) rank-deficient Jacobian matrices. The method is matrix-free (i.e., it does not require explicit storage or factorizations of derivative matrices), allows for inexact step computations, and is applicable for nonconvex problems. The main components of the …
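
A small sketch of how a feasibility-restoring "normal" step can be computed matrix-free in a way that tolerates rank-deficient Jacobians, here via SciPy's LSMR with a small damping term; the step decomposition and all names are illustrative assumptions, not the paper's exact algorithm.

import numpy as np
from scipy.sparse.linalg import LinearOperator, lsmr

def inexact_normal_step(c, jacvec, jacTvec, n, damp=1e-8, rtol=1e-2):
    """Compute a (possibly inexact) least-squares normal step v that reduces
    the constraint violation:  min_v ||c + J v||  (with small damping),
    using only products with J and J^T.  LSMR copes gracefully with
    (nearly) rank-deficient Jacobians, returning a regularized solution.
    """
    m = c.size
    J = LinearOperator((m, n), matvec=jacvec, rmatvec=jacTvec)
    v = lsmr(J, -c, damp=damp, atol=rtol, btol=rtol)[0]
    return v

# Illustrative rank-deficient example: two identical constraints in R^3.
A = np.array([[1.0, 2.0, 0.0],
              [1.0, 2.0, 0.0]])
c = np.array([3.0, 3.0])
v = inexact_normal_step(c, lambda x: A @ x, lambda y: A.T @ y, n=3)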