Response surface methods show promising results for global optimization of costly nonconvex objective functions, i.e. the problem of finding the global minimum when there are several local minima and each function value takes considerable CPU time to compute. Such problems often arise in industrial and financial applications, where a function value may be the result of a time-consuming computer simulation or optimization. Derivatives are often difficult or impossible to obtain. Here the problem is extended with linear and nonlinear constraints, where the nonlinear constraints may themselves be costly to evaluate. A new algorithm is presented that handles the constraints, is based on radial basis functions (RBF), and preserves the convergence proof of the original RBF algorithm. The algorithm takes advantage of the optimization algorithms in the Tomlab optimization environment (www.tomlab.biz). Numerical results are presented for standard test problems.
Department of Mathematics and Physics, Mälardalen University, P.O. Box 883, SE-721 23 Västerås, Sweden. Research Report MdH-IMa-2004.
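To illustrate the interpolation step at the heart of RBF-based response surface methods, the following is a minimal one-dimensional sketch: a cubic RBF surrogate with a linear polynomial tail is fitted to a handful of samples of an expensive function, after which the cheap surrogate can stand in for the costly objective. The function name, the toy objective, and the sample points are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rbf_surrogate(X, f):
    """Cubic RBF interpolant with linear polynomial tail (1-D sketch):
       s(x) = sum_i w_i |x - x_i|^3 + c0 + c1 * x."""
    n = len(X)
    # Pairwise-distance matrix raised to the cubic RBF power
    Phi = np.abs(X[:, None] - X[None, :]) ** 3
    # Linear polynomial tail ensures the interpolation system is nonsingular
    P = np.column_stack([np.ones(n), X])
    A = np.block([[Phi, P], [P.T, np.zeros((2, 2))]])
    coef = np.linalg.solve(A, np.concatenate([f, np.zeros(2)]))
    w, c = coef[:n], coef[n:]
    return lambda x: w @ (np.abs(x - X) ** 3) + c[0] + c[1] * x

# Samples of a stand-in "costly" objective (illustrative only)
X = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
f = np.sin(X) + 0.1 * X ** 2
s = rbf_surrogate(X, f)
print(abs(s(1.0) - f[3]) < 1e-8)  # the surrogate interpolates the samples
```

In the full method the surrogate is re-fitted as new expensive evaluations arrive, and the next evaluation point is chosen by optimizing a merit function over the cheap surrogate rather than the costly objective.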