Do You Trust Derivatives or Differences?

We analyze the relationship between the noise level of a function and the accuracy and reliability of derivatives and difference estimates. We derive and empirically validate measures of quality for both derivatives and difference estimates. Using these measures, we quantify the accuracy of derivatives and differences in terms of the noise level of the function. …
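The trade-off this abstract refers to can be sketched numerically. Below is a minimal illustration (not the paper's actual noise-quantification procedure), assuming a known noise level `eps_f` and a forward difference whose step size balances truncation error against noise error:

```python
import numpy as np

rng = np.random.default_rng(0)
eps_f = 1e-8  # noise level of the function (assumed known for this sketch)

def f(x):
    # smooth function corrupted by uniformly bounded noise
    return np.sin(x) + eps_f * rng.uniform(-1.0, 1.0)

def forward_difference(func, x, h):
    return (func(x + h) - func(x)) / h

x0 = 0.5
true_deriv = np.cos(x0)

# The forward-difference error splits into a truncation term O(h) and a
# noise term O(eps_f / h); balancing them suggests h* ~ 2 sqrt(eps_f / |f''|).
h_star = 2.0 * np.sqrt(eps_f / abs(np.sin(x0)))

errors = {}
for h in (1e-1, h_star, 1e-7):
    errors[h] = abs(forward_difference(f, x0, h) - true_deriv)
    print(f"h = {h:.2e}  error = {errors[h]:.2e}")
```

A step that is too large is dominated by truncation error, one that is too small amplifies the noise; the balanced step sits in between.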

Simulation Optimization for the Stochastic Economic Lot Scheduling Problem

We study simulation optimization methods for the stochastic economic lot scheduling problem. In contrast to prior research, we focus on methods that treat this problem as a black box. Based on a large-scale numerical study, we compare approximate dynamic programming with a global search for parameters of simple control policies. We propose two value function …
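As a toy stand-in for "search over parameters of simple control policies with the simulator as a black box," the sketch below grid-searches the parameter of a base-stock policy on a tiny inventory simulator. The model, cost rates, and parameter range are invented for illustration and are far simpler than the stochastic economic lot scheduling problem:

```python
import numpy as np

def simulate_cost(base_stock, n_periods=2000, seed=0):
    """Monte Carlo estimate of average cost for a base-stock policy.

    Toy single-item inventory model: each period we order up to
    `base_stock`, demand is Poisson, and we pay holding and backlog costs.
    """
    rng = np.random.default_rng(seed)  # common random numbers across candidates
    holding, backlog = 1.0, 9.0
    inventory, total = 0.0, 0.0
    for _ in range(n_periods):
        inventory = base_stock                  # order up to the target level
        inventory -= rng.poisson(5.0)           # random demand this period
        total += holding * max(inventory, 0) + backlog * max(-inventory, 0)
    return total / n_periods

# Treat the simulator as a black box and search over the policy parameter.
costs = {s: simulate_cost(s) for s in range(1, 15)}
best = min(costs, key=costs.get)
print("best base-stock level:", best, "cost:", round(costs[best], 2))
```

Fixing the seed across candidates is the common-random-numbers trick: it makes the comparison between policy parameters less noisy.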

Global optimization of expensive black box problems with a known lower bound

In this paper we propose an algorithm for the global optimization of computationally expensive black-box functions. For this class of problems, no information such as the gradient is available, and each function evaluation is highly expensive. In many applications, however, a lower bound on the objective function is known; in this situation we derive a …
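One concrete payoff of a known lower bound is a computable optimality gap that can certify early stopping and save expensive evaluations. The sketch below is not the authors' algorithm; it merely shows that mechanism with plain random search on a toy objective whose minimum value (hence lower bound) is zero:

```python
import numpy as np

def expensive_black_box(x):
    # stand-in for an expensive simulation; the global minimum value is 0,
    # so the lower bound f_low = 0 is known even though the minimizer is not
    return np.sum((x - 0.3) ** 2)

f_low = 0.0       # known lower bound on the objective
tol = 1e-2        # stop once we are provably this close to optimal
budget = 10_000

rng = np.random.default_rng(1)
best_x, best_f = None, np.inf
evals = 0
for _ in range(budget):
    x = rng.uniform(0.0, 1.0, size=2)
    fx = expensive_black_box(x)
    evals += 1
    if fx < best_f:
        best_x, best_f = x, fx
    # the known lower bound turns "best so far" into a certificate:
    # best_f - f_low bounds the true optimality gap from above
    if best_f - f_low <= tol:
        break

print(f"stopped after {evals} evaluations, certified gap <= {best_f - f_low:.4f}")
```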

A surrogate management framework using rigorous trust-region steps

Surrogate models and heuristics are frequently used in the optimization engineering community as convenient approaches to deal with functions for which evaluations are expensive or noisy, or lack convexity. These methodologies do not typically guarantee any type of convergence under reasonable assumptions and frequently exhibit slow convergence. In this paper we will show how to …
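The "rigorous step" idea rests on the classical trust-region ratio test: accept a step only when the surrogate's predicted reduction is matched by actual reduction, and shrink the trust region otherwise. A minimal sketch, using a simple linear surrogate with a Cauchy-type step and the Rosenbrock function as a stand-in (not the paper's framework):

```python
import numpy as np

def f(x):
    # stand-in for an expensive objective (Rosenbrock)
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad(x):
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                     200 * (x[1] - x[0]**2)])

x = np.array([-1.2, 1.0])
delta, eta = 1.0, 0.1            # trust-region radius, acceptance threshold
for _ in range(200):
    g = grad(x)
    step = -delta * g / (np.linalg.norm(g) + 1e-16)  # Cauchy-type step on a linear surrogate
    predicted = -g @ step                            # reduction the surrogate promises (> 0)
    actual = f(x) - f(x + step)                      # reduction the true function delivers
    rho = actual / predicted                         # ratio test
    if rho >= eta:
        x = x + step                                 # surrogate was trustworthy: accept
        delta = min(2.0 * delta, 10.0)               # and expand the trust region
    else:
        delta *= 0.5                                 # otherwise shrink where it is trusted
print("f(x) =", f(x))
```

Because steps are accepted only when the actual reduction is a fixed fraction of the predicted one, the accepted iterates decrease the objective monotonically.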

Optimal adaptive control of cascading power grid failures

We describe experiments with parallel algorithms for computing adaptive controls for attenuating power grid cascading failures. Citation: Columbia University, 2010.

Preconditioning and Globalizing Conjugate Gradients in Dual Space for Quadratically Penalized Nonlinear-Least Squares Problems

When solving nonlinear least-squares problems, it is often useful to regularize the problem using a quadratic term, a practice which is especially common in applications arising in inverse calculations. A solution method derived from a trust-region Gauss-Newton algorithm is analyzed for such applications, where, contrary to the standard algorithm, the least-squares subproblem solved at each …
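The dual-space idea can be illustrated on the linearized, quadratically penalized subproblem: when there are far fewer observations than parameters, the Gauss-Newton step can be obtained from a small system in observation (dual) space instead of the large primal one. A sketch with direct solves standing in for the paper's preconditioned conjugate gradients:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 5, 200            # few observations, many parameters
lam = 0.5                # weight of the quadratic (Tikhonov) penalty
J = rng.standard_normal((m, n))   # Jacobian of the residuals
r = rng.standard_normal(m)        # current residual vector

# Primal Gauss-Newton step for min ||r + J p||^2 + lam ||p||^2:
# (J^T J + lam I) p = -J^T r  --  an n x n system.
p_primal = np.linalg.solve(J.T @ J + lam * np.eye(n), -J.T @ r)

# Dual-space equivalent: solve the m x m system
# (J J^T + lam I) y = -r, then recover p = J^T y.
y = np.linalg.solve(J @ J.T + lam * np.eye(m), -r)
p_dual = J.T @ y

print("max difference between primal and dual steps:",
      np.max(np.abs(p_primal - p_dual)))
```

The equivalence follows from (J^T J + lam I) J^T y = J^T (J J^T + lam I) y = -J^T r, so the m-dimensional solve yields the same step at a fraction of the cost when m is much smaller than n.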

Self-correcting geometry in model-based algorithms for derivative-free unconstrained optimization

Several efficient methods for derivative-free optimization (DFO) are based on the construction and maintenance of an interpolation model for the objective function. Most of these algorithms use special “geometry-improving” iterations, where the geometry (poisedness) of the underlying interpolation set is improved at the cost of one or more function evaluations. We show that such …
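Poisedness matters because the interpolation system degenerates as the sample points become affinely dependent. A minimal sketch (illustrative only, not the paper's self-correcting mechanism) comparing a well-poised and a nearly collinear sample set for a linear interpolation model:

```python
import numpy as np

def linear_model(points, values):
    # interpolate m(x) = c + g . x through len(points) = dim + 1 samples
    A = np.hstack([np.ones((len(points), 1)), points])
    coef = np.linalg.solve(A, values)
    return coef[0], coef[1:]

f = lambda x: x[0]**2 + 3 * x[1]

good = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])    # well-poised sample set
bad = np.array([[0.0, 0.0], [1.0, 1.0], [1.0, 1.001]])   # nearly collinear set

c, g = linear_model(good, np.array([f(p) for p in good]))
print("model gradient from the well-poised set:", g)

# poisedness shows up as conditioning of the interpolation matrix
conds = [np.linalg.cond(np.hstack([np.ones((3, 1)), pts])) for pts in (good, bad)]
print("condition numbers (good, bad):", conds)
```

The nearly collinear set produces an interpolation matrix orders of magnitude worse conditioned, which is exactly what geometry-improving iterations are designed to repair.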

ORBIT: Optimization by Radial Basis Function Interpolation in Trust-Regions

We present a new derivative-free algorithm, ORBIT, for unconstrained local optimization of computationally expensive functions. A trust-region framework using interpolating Radial Basis Function (RBF) models is employed. The RBF models considered often allow ORBIT to interpolate nonlinear functions using fewer function evaluations than the polynomial models considered by present techniques. Approximation guarantees are obtained by …
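The interpolation step at the core of such methods can be sketched as follows; this uses a plain Gaussian RBF for simplicity, whereas ORBIT itself employs particular RBF types inside a full trust-region framework:

```python
import numpy as np

def rbf_interpolant(centers, values, gamma=2.0):
    # Gaussian RBF interpolant s(x) = sum_i w_i exp(-gamma ||x - x_i||^2)
    d2 = np.sum((centers[:, None, :] - centers[None, :, :])**2, axis=-1)
    w = np.linalg.solve(np.exp(-gamma * d2), values)  # solve for the weights
    def s(x):
        return np.exp(-gamma * np.sum((centers - x)**2, axis=1)) @ w
    return s

f = lambda x: np.sin(3 * x[0]) * x[1]           # nonlinear "expensive" function
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(25, 2))            # 25 sampled evaluations
s = rbf_interpolant(X, np.array([f(x) for x in X]))

# interpolation means the model reproduces the data up to rounding error
residual = max(abs(s(x) - f(x)) for x in X)
print("max residual at sample points:", residual)
```

Unlike a fixed-degree polynomial, the RBF interpolant accommodates any number of sample points, which is one reason such models can use fewer evaluations on nonlinear functions.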

Robust Nonconvex Optimization for Simulation-based Problems

In engineering design, an optimized solution often turns out to be suboptimal when implementation errors are encountered. While the theory of robust convex optimization has taken significant strides over the past decade, all approaches fail if the underlying cost function is not explicitly given; it is even worse if the cost function is nonconvex. In …
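One way to see why nominal optima fail under implementation errors: a deep but narrow minimum can be worse, in the worst case over a perturbation neighborhood, than a shallow but wide one. A brute-force sampled sketch of that effect (not the paper's robust local search algorithm):

```python
import numpy as np

def f(x):
    # nonconvex cost: a deep, narrow minimum near 0 and a shallow, wide one near 3
    return -2.0 * np.exp(-50 * x**2) - np.exp(-0.5 * (x - 3.0)**2)

def worst_case(x, radius=0.5, n=101):
    # sampled estimate of max_{|d| <= radius} f(x + d): the cost under
    # the worst implementation error of size up to `radius`
    return max(f(x + d) for d in np.linspace(-radius, radius, n))

xs = np.linspace(-1.0, 5.0, 601)
x_nominal = xs[np.argmin([f(x) for x in xs])]          # ignores implementation error
x_robust = xs[np.argmin([worst_case(x) for x in xs])]  # hedges against it
print("nominal minimizer:", round(x_nominal, 2),
      " robust minimizer:", round(x_robust, 2))
```

The nominal search lands in the narrow well near 0, while the robust criterion prefers the wide basin near 3, where small implementation errors barely degrade the cost.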