Adaptive Newton-CG methods with global and local analysis for unconstrained optimization with Hölder continuous Hessian

In this paper, we study Newton-conjugate gradient (Newton-CG) methods for minimizing a nonconvex function $f$ whose Hessian is $(H_f,\nu)$-Hölder continuous with modulus $H_f>0$ and exponent \(\nu\in(0,1]\). Recently proposed Newton-CG methods for this problem \cite{he2025newton} adopt (i) non-adaptive regularization and (ii) a nested line-search procedure; (i) often leads to inefficient early progress and the loss of local superlinear convergence, while (ii) may incur high computational cost due to multiple solves of the Newton system per iteration. To address these limitations, we propose two novel Newton-CG algorithms, tailored to whether $\nu$ is known, that adaptively regularize the Newton system and leverage the auto-conditioning technique to eliminate the nested line search. The proposed algorithms achieve the best-known iteration complexity ${\mathcal O}\big(H_f^{1/(1+\nu)}\epsilon^{-(2+\nu)/(1+\nu)}\big)$ for finding an $\epsilon$-stationary point and, simultaneously, enjoy local superlinear convergence near nondegenerate local minimizers. Numerical experiments further demonstrate the practical advantages of our algorithms over existing approaches.
