The limited-memory BFGS (L-BFGS) algorithm is a popular method for solving large-scale unconstrained minimization problems. Since L-BFGS performs a line search satisfying the Wolfe conditions, it may require many function evaluations on ill-posed problems. To overcome this difficulty, we propose a method that combines L-BFGS with the regularized Newton method. The computational cost per iteration of the proposed method is the same as that of the original L-BFGS method. We show that the proposed method is globally convergent under the usual assumptions. Moreover, we present numerical results demonstrating the robustness of the proposed method.
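As a rough illustration of the building blocks named in the abstract, the sketch below shows the standard L-BFGS two-loop recursion in Python. The abstract does not describe how the regularization enters the proposed method, so the parameter `mu` and its placement in the initial Hessian scaling are purely hypothetical assumptions; setting `mu = 0` recovers the ordinary L-BFGS search direction.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list, mu=0.0):
    """Standard L-BFGS two-loop recursion for a search direction.

    s_list, y_list hold the most recent curvature pairs
    s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k (oldest first).
    `mu` is a hypothetical regularization parameter (not specified in
    the abstract); mu = 0 gives the usual L-BFGS direction.
    """
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest curvature pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Initial Hessian approximation H0 = gamma * I; the damping by mu
    # is only an illustrative stand-in for a regularization term.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        gamma = np.dot(s, y) / (np.dot(y, y) + mu)
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest curvature pair to newest.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r  # descent direction
```

This is only a sketch of the generic machinery; the proposed combination with the regularized Newton method and its convergence analysis are given in the report itself.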
Citation
Technical report, Department of Applied Mathematics and Physics, Graduate School of Informatics, Kyoto University, Japan, 2014.