Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods

This paper presents a finite difference quasi-Newton method for the minimization of noisy functions. The method takes advantage of the scalability and power of BFGS updating, and employs an adaptive procedure for choosing the differencing interval h based on the noise estimation techniques of Hamming (2012) and Moré and Wild (2011). This noise estimation procedure …
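As a rough illustration of the idea (not the paper's exact procedure), the sketch below computes a forward-difference gradient with an interval h chosen from a noise level. Here `noise_level` and `curvature` are assumed inputs supplied by the caller; in the paper these quantities come from the noise estimation techniques of Hamming and Moré–Wild.

```python
import numpy as np

def fd_gradient(f, x, noise_level, curvature=1.0):
    """Forward-difference gradient with a noise-adaptive interval.

    Illustrative sketch: for a noisy function with noise level eps_f,
    the interval balancing truncation error O(h*|f''|) against noise
    error O(eps_f/h) scales like sqrt(eps_f / |f''|).
    """
    h = 2.0 * np.sqrt(noise_level / max(curvature, 1e-12))
    fx = f(x)
    g = np.empty_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - fx) / h   # one extra evaluation per coordinate
    return g
```

With a tiny noise level the gradient of a smooth quadratic is recovered to roughly O(h) accuracy, as expected for forward differences.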

A Dynamic Penalty Parameter Updating Strategy for Matrix-Free Sequential Quadratic Optimization

This paper focuses on the design of sequential quadratic optimization (commonly known as SQP) methods for solving large-scale nonlinear optimization problems. The most computationally demanding aspect of such an approach is the computation of the search direction during each iteration, for which we consider the use of matrix-free methods. In particular, we develop a method …

On the Complexity of Testing Attainment of the Optimal Value in Nonlinear Optimization

We prove that unless P=NP, there exists no polynomial time (or even pseudo-polynomial time) algorithm that can test whether the optimal value of a nonlinear optimization problem where the objective and constraints are given by low-degree polynomials is attained. If the degrees of these polynomials are fixed, our results along with previously known “Frank-Wolfe type” theorems …

Solving Quadratic Programs to High Precision using Scaled Iterative Refinement

Quadratic optimization problems (QPs) are ubiquitous, and solution algorithms have matured to a reliable technology. However, the precision of solutions is usually limited due to the underlying floating-point operations. This may cause inconveniences when solutions are used for rigorous reasoning. We contribute on three levels to overcome this issue. First, we present a novel refinement …
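The core idea of scaled iterative refinement can be illustrated on a plain linear system (the paper applies it to QPs): solve in limited precision, form the residual in higher precision, scale the residual up, re-solve, and add the scaled-back correction. The solver handle `low_precision_solve` below is a stand-in assumption, not the paper's solver.

```python
import numpy as np

def low_precision_solve(A, b):
    # Stand-in for a limited-precision solver: solve in single precision.
    return np.linalg.solve(A.astype(np.float32),
                           b.astype(np.float32)).astype(np.float64)

def refine(A, b, solve, tol=1e-12, max_it=20):
    """Toy sketch of scaled iterative refinement for Ax = b."""
    x = solve(A, b)
    for _ in range(max_it):
        r = b - A @ x                    # residual in double precision
        nrm = np.linalg.norm(r, np.inf)
        if nrm <= tol:
            break
        scale = 1.0 / nrm                # zoom in on the residual system
        d = solve(A, scale * r)          # re-solve at unit scale
        x = x + d / scale                # add the scaled-back correction
    return x
```

Each pass shrinks the residual by roughly the accuracy of the low-precision solve, so a handful of cheap solves yields near-double-precision residuals.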

A Globally Asymptotically Stable Polynomial Vector Field with Rational Coefficients and no Local Polynomial Lyapunov Function

We give an explicit example of a two-dimensional polynomial vector field of degree seven that has rational coefficients, is globally asymptotically stable, but does not admit an analytic Lyapunov function even locally. Citation: Submitted for publication.

The Mesh Adaptive Direct Search Algorithm for Granular and Discrete Variables

The mesh adaptive direct search (Mads) algorithm is designed for blackbox optimization problems for which the functions defining the objective and the constraints are typically the outputs of a simulation seen as a blackbox. It is a derivative-free optimization method designed for continuous variables and is supported by a convergence analysis based on the Clarke …

Optimality conditions and global convergence for nonlinear semidefinite programming

Sequential optimality conditions have played a major role in unifying and extending global convergence results for several classes of algorithms for general nonlinear optimization. In this paper, we extend these concepts to nonlinear semidefinite programming. We define two sequential optimality conditions for nonlinear semidefinite programming. The first is a natural extension of the so-called Approximate-Karush-Kuhn-Tucker …

MIQP-Based Algorithm for the Global Solution of Economic Dispatch Problems with Valve-Point Effects

Even in a static setting, the economic load dispatch problem (ELDP)—namely the cost-optimal distribution of power among generating units to meet a specific demand subject to system constraints—turns out to be a challenge owing to the consideration of valve-point effects (VPE), which make the cost function nonsmooth and nonconvex. We present a new method, termed …

Superiorization and perturbation resilience of algorithms: A continuously updated bibliography

This document presents a chronologically ordered bibliography of scientific publications on the superiorization methodology and perturbation resilience of algorithms, which is compiled and continuously updated by us at: http://math.haifa.ac.il/yair/bib-superiorization-censor.html. Since the topic is relatively new, it is possible to trace the work that has been published about it since its inception. To the best of …

A comparison of methods for traversing non-convex regions in optimization problems

This paper considers again the well-known problem of dealing with non-convex regions during the minimization of a nonlinear function F(x) by Newton-like methods. The proposal made here involves a curvilinear search along an approximation to the continuous steepest descent path defined by the solution of the ODE dx/dt = -grad F(x). The algorithm we develop …
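The continuous steepest-descent path referred to above can be approximated in the simplest possible way by explicit Euler steps on dx/dt = -grad F(x). The sketch below is only this naive integrator, with an assumed fixed step `dt`; the paper's curvilinear search is a more sophisticated construction along such a path.

```python
import numpy as np

def euler_descent_path(grad, x0, dt=0.1, steps=50):
    """Explicit-Euler approximation to the steepest-descent ODE path.

    Toy sketch: each step moves a fixed fraction `dt` along -grad F(x),
    tracing a discrete approximation to the continuous descent trajectory.
    """
    x = np.asarray(x0, dtype=float)
    path = [x.copy()]
    for _ in range(steps):
        x = x - dt * grad(x)   # Euler step along the negative gradient
        path.append(x.copy())
    return path
```

For the convex quadratic F(x) = ||x||^2 / 2 (so grad F(x) = x), each step contracts the iterate by the factor (1 - dt), and the path spirals straight into the minimizer at the origin.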