We present a structured quasi-Newton algorithm for unconstrained optimization problems in which some second-order derivatives (Hessian terms) are unavailable. We provide a formal derivation of a BFGS secant update formula that approximates only the missing Hessian terms, and we propose a line-search quasi-Newton algorithm, based on a modification of the Wolfe conditions, that converges to points satisfying the first-order optimality conditions. We also analyze the local convergence properties of the structured BFGS algorithm and show that it achieves superlinear convergence under the standard assumptions made for quasi-Newton methods with secant updates. We provide a thorough study of the practical performance of the algorithm on the CUTEr suite of test problems and show that our structured BFGS-based quasi-Newton algorithm outperforms its unstructured counterparts.
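The sketch below illustrates the general idea of a structured quasi-Newton iteration of the kind described above; it is not the paper's algorithm. It assumes the Hessian splits into a known part `known_hess(x)` and an unknown part, approximates only the unknown part with a BFGS-style matrix driven by a structured secant residual, and uses a simple Armijo backtracking line search in place of the paper's modified Wolfe conditions. All function names and tolerances are illustrative assumptions.

```python
import numpy as np

def structured_bfgs(f, grad, known_hess, x0, tol=1e-6, max_iter=200):
    """Sketch of a structured quasi-Newton method (not the paper's exact algorithm).

    Assumes the true Hessian splits as H(x) = K(x) + U(x), where K(x) = known_hess(x)
    is available and the unknown part U(x) is approximated by a BFGS-style matrix A
    satisfying the structured secant condition A_{k+1} s_k = (g_{k+1} - g_k) - K(x_{k+1}) s_k.
    """
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    A = np.eye(n)                      # approximation of the unknown Hessian part
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        B = known_hess(x) + A          # structured Hessian approximation
        try:
            p = -np.linalg.solve(B, g)
            if g @ p >= 0:             # not a descent direction: fall back to steepest descent
                p = -g
        except np.linalg.LinAlgError:
            p = -g
        # Backtracking line search with an Armijo sufficient-decrease test
        # (the paper uses modified Wolfe conditions instead).
        t, c1 = 1.0, 1e-4
        for _ in range(50):
            if f(x + t * p) <= f(x) + c1 * t * (g @ p):
                break
            t *= 0.5
        x_new = x + t * p
        g_new = grad(x_new)
        s = x_new - x
        # Structured secant residual: target only the missing Hessian part.
        y_hat = (g_new - g) - known_hess(x_new) @ s
        # Standard BFGS update of A, applied only when the curvature condition holds.
        if s @ y_hat > 1e-10 * np.linalg.norm(s) * np.linalg.norm(y_hat):
            As = A @ s
            A = A - np.outer(As, As) / (s @ As) + np.outer(y_hat, y_hat) / (s @ y_hat)
        x, g = x_new, g_new
    return x
```

For example, on a problem where only part of the Hessian can be coded analytically, `known_hess` would return that part and the update above fills in the rest; setting `known_hess = lambda x: np.zeros((x.size, x.size))` recovers an ordinary (unstructured) BFGS iteration for comparison.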
Citation
Lawrence Livermore National Laboratory, Report LLNL-JRNL-745068, Jan 2018.
Article
A structured quasi-Newton algorithm for optimizing with incomplete Hessian information