This paper uses conditions for the global and superlinear convergence of the two-parameter self-scaling Broyden family of quasi-Newton algorithms for unconstrained optimization to derive a wide interval for self-scaling updates. Numerical testing shows that such algorithms not only accelerate the convergence of the (unscaled) methods from the so-called convex class, but also increase their chances of success. Self-scaling updates from the preconvex and the postconvex classes are shown to be effective in practice, and new algorithms which work well in practice, with or without scaling, are also obtained from the new interval. Unlike the behaviour of unscaled methods, numerical testing shows that varying the updating parameter within the proposed interval has little effect on the performance of the self-scaling algorithms.
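To illustrate the kind of method the abstract refers to, the following is a minimal sketch of a self-scaling quasi-Newton iteration. It implements the BFGS member of the Broyden family with the classical Oren-Luenberger scaling factor tau = s'y / (y'Hy); the specific two-parameter family and the scaling interval derived in the paper are not reproduced here, and the function name, tolerances, and line-search constants are illustrative assumptions.

```python
import numpy as np

def self_scaling_bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    # Sketch of a self-scaling BFGS iteration (Oren-Luenberger scaling);
    # not the paper's specific two-parameter update or interval.
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                     # quasi-Newton search direction
        # backtracking Armijo line search
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                 # curvature condition holds
            Hy = H @ y
            tau = sy / (y @ Hy)        # self-scaling factor
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            # scale the inherited part of H before the BFGS correction
            H = tau * (V @ H @ V.T) + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Usage: minimise a strictly convex quadratic f(x) = 0.5 x'Ax - b'x,
# whose minimiser is the solution of Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = self_scaling_bfgs(lambda x: 0.5 * x @ A @ x - b @ x,
                           lambda x: A @ x - b,
                           np.zeros(2))
```

On quadratics the scaling factor tau corrects the size of the inherited approximation each iteration, which is the acceleration effect the abstract describes for the convex class.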
DOMAS 03/1, Sultan Qaboos University, Oman, July, 2003.
A Wide Interval for Efficient Self-Scaling Quasi-Newton Algorithms