Some highly successful algorithms for unconstrained minimization without derivatives construct changes to the variables by applying trust region methods to quadratic approximations to the objective function F(x), x in R^n. A quadratic model has (n+1)(n+2)/2 independent parameters, but each new model may interpolate only 2n+1 values of F, for instance. The symmetric Broyden method takes up the remaining freedom by minimizing the Frobenius norm of the difference between the second derivative matrices of the old and new models, which usually works well in practice. We consider an extension of this technique that combines changes in first derivatives with changes in second derivatives. A simple example suggests that the extension does bring some advantages, but numerical experiments on three test problems with up to 320 variables are disappointing. On the other hand, rates of convergence are investigated numerically when F is a homogeneous quadratic function, which allows very high accuracy to be achieved in practice, the initial and final errors in the variables being about 10 and 10^{-5000}, respectively. It is clear in some of these experiments that the extension does reduce the number of iterations. The main difficulty in the work was finding a way of implementing the extension sufficiently accurately in only O(n^2) operations on each iteration. A version of the truncated conjugate gradient procedure is suitable; it is used in the numerical experiments and is described in detail in an appendix.
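For readers unfamiliar with the underlying update, the following is a minimal dense sketch of the standard symmetric Broyden (least Frobenius norm) update that the abstract refers to: the new quadratic interpolates the given values of F while the change in the second derivative matrix has minimum Frobenius norm. It is not the report's extension (which also weights changes in first derivatives), nor its O(n^2)-per-iteration implementation via truncated conjugate gradients; the function name and the dense KKT formulation are illustrative assumptions.

```python
import numpy as np

def symmetric_broyden_update(x0, c_old, g_old, B_old, points, fvals):
    """Least-Frobenius-norm (symmetric Broyden) update of a quadratic model.

    Old model: Q_old(x) = c_old + g_old^T (x - x0) + 0.5 (x - x0)^T B_old (x - x0).
    The new model interpolates fvals[i] = F(points[i]) and minimizes
    ||B_new - B_old||_F over all interpolating quadratics.

    Dense reference sketch (O(m^3) for m interpolation points), not the
    O(n^2)-per-iteration scheme discussed in the report.
    """
    m = len(points)
    n = x0.size
    S = np.array([p - x0 for p in points])          # m x n matrix of displacements s_i

    # Residuals of the old model at the interpolation points.
    r = np.array([fvals[i]
                  - (c_old + g_old @ S[i] + 0.5 * S[i] @ B_old @ S[i])
                  for i in range(m)])

    # KKT system: the Hessian change is D = sum_j lam_j s_j s_j^T, and
    # (lam, dc, dg) satisfy the interpolation conditions together with the
    # multiplier conditions sum_j lam_j = 0 and sum_j lam_j s_j = 0.
    A = 0.5 * (S @ S.T) ** 2                        # A_ij = 0.5 (s_i^T s_j)^2
    W = np.hstack([np.ones((m, 1)), S])             # rows [1, s_i^T]
    K = np.block([[A, W], [W.T, np.zeros((n + 1, n + 1))]])
    rhs = np.concatenate([r, np.zeros(n + 1)])
    sol = np.linalg.solve(K, rhs)

    lam, dc, dg = sol[:m], sol[m], sol[m + 1:]
    D = (S.T * lam) @ S                             # sum_j lam_j s_j s_j^T
    return c_old + dc, g_old + dg, B_old + D
```

With the 2n+1 interpolation points mentioned in the abstract, the linear system above has m + n + 1 = 3n + 2 unknowns and is nonsingular whenever the points are suitably poised; the report's contribution lies in going beyond this update by also penalizing the change in the model's gradient, and in doing so accurately within O(n^2) operations per iteration.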
Citation
Report number DAMTP 2010/NA04, Centre for Mathematical Sciences, Wilberforce Road, Cambridge CB3 0WA, UK (March 2010).
Article
Beyond symmetric Broyden for updating quadratic models in minimization without derivatives