Adaptive Newton-CG methods with global and local analysis for unconstrained optimization with Hölder continuous Hessian

In this paper, we study Newton-conjugate gradient (Newton-CG) methods for minimizing a nonconvex function $f$ whose Hessian is $(H_f,\nu)$-Hölder continuous with modulus $H_f>0$ and exponent $\nu\in(0,1]$. Recently proposed Newton-CG methods for this problem \cite{he2025newton} adopt (i) non-adaptive regularization and (ii) a nested line-search procedure, where (i) often leads to inefficient early progress and the loss …
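The adaptive-regularization idea can be illustrated with a minimal sketch. Everything below is an illustrative stand-in rather than the paper's method: the regularizer $\sigma\|g\|^{\nu/(1+\nu)}$ and the simple doubling/halving update of $\sigma$ on rejected/accepted steps are assumed rules chosen only to show the flavor of adaptivity.

```python
import numpy as np

def cg_solve(matvec, b, tol=1e-10, max_iter=200):
    """Plain conjugate gradient for the shifted system (assumed SPD here)."""
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def adaptive_newton_cg(f, grad, hess, x0, sigma=1.0, nu=1.0,
                       tol=1e-8, max_iter=100):
    """Sketch of an adaptive regularized Newton-CG loop (Hölder exponent nu).

    Illustrative only: sigma is doubled on rejected steps and halved on
    accepted ones; real methods use more refined acceptance tests and
    safeguards against negative curvature.
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        # Hölder-style regularization: shift scales with ||g||^(nu/(1+nu))
        reg = sigma * np.linalg.norm(g) ** (nu / (1.0 + nu))
        H = hess(x)
        d = cg_solve(lambda v: H @ v + reg * v, -g)
        if f(x + d) < f(x):        # decrease achieved: accept, relax sigma
            x = x + d
            sigma = max(sigma / 2.0, 1e-8)
        else:                      # reject step, increase regularization
            sigma *= 2.0
    return x
```

On a convex quadratic the shifted system is SPD, every trial step is a descent step, and $\sigma$ shrinks so the iteration approaches a pure Newton step.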

Complexity of normalized stochastic first-order methods with momentum under heavy-tailed noise

In this paper, we propose practical normalized stochastic first-order methods with Polyak momentum, multi-extrapolated momentum, and recursive momentum for solving unconstrained optimization problems. These methods employ dynamically updated algorithmic parameters and do not require explicit knowledge of problem-dependent quantities such as the Lipschitz constant or noise bound. We establish first-order oracle complexity results for finding …
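To show the flavor of these methods, here is a minimal sketch of a normalized first-order update with Polyak (heavy-ball) momentum. The $1/\sqrt{t+1}$ step decay and the momentum weight below are illustrative choices, not the paper's parameter rules; the point is that the step length is fixed by the schedule alone, so no Lipschitz constant or noise bound is needed.

```python
import numpy as np

def normalized_sgd_momentum(grad, x0, eta=0.5, beta=0.5, iters=300, rng=None):
    """Normalized stochastic gradient method with Polyak momentum (sketch).

    m is a running momentum average of (possibly noisy) gradients, and each
    iterate moves a step of length eta_t along -m/||m||.  The 1/sqrt(t+1)
    decay is an assumed schedule for illustration only.
    """
    x = x0.astype(float)
    m = np.zeros_like(x)
    for t in range(iters):
        g = grad(x, rng)                     # stochastic gradient oracle
        m = beta * m + (1.0 - beta) * g      # Polyak momentum average
        eta_t = eta / np.sqrt(t + 1.0)
        norm_m = np.linalg.norm(m)
        if norm_m > 0:
            x -= eta_t * m / norm_m          # normalized step
    return x
```

Normalization caps the displacement per iteration at exactly `eta_t`, which is what makes such methods robust to heavy-tailed gradient noise.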

Stochastic first-order methods with multi-extrapolated momentum for highly smooth unconstrained optimization

In this paper, we consider an unconstrained stochastic optimization problem where the objective function exhibits a high order of smoothness. In particular, we propose a stochastic first-order method (SFOM) with multi-extrapolated momentum, in which multiple extrapolations are performed in each iteration, followed by a momentum step based on these extrapolations. We show that our proposed …
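Read literally, the mechanism in the abstract can be sketched as follows. This is only one plausible reading: the extrapolation points $x + \theta i\,(x - x_{\mathrm{prev}})$, the plain averaging of their gradients, and all coefficients are assumptions for illustration, not the paper's scheme.

```python
import numpy as np

def sfom_multi_extrap(grad, x0, n_extrap=2, beta=0.5, eta=0.1,
                      theta=0.5, iters=300):
    """Stochastic first-order method with multi-extrapolated momentum (sketch).

    Illustrative assumptions: extrapolation points lie along the last
    displacement x - x_prev, and the momentum step uses their averaged
    gradient.  Coefficients theta, beta, eta are placeholders.
    """
    x_prev = x0.astype(float)
    x = x0.astype(float)
    m = np.zeros_like(x)
    for _ in range(iters):
        d = x - x_prev
        # evaluate gradients at several extrapolated points
        gs = [grad(x + theta * i * d) for i in range(1, n_extrap + 1)]
        g_bar = np.mean(gs, axis=0)
        m = beta * m + (1.0 - beta) * g_bar   # momentum on extrapolated grads
        x_prev, x = x, x - eta * m
    return x
```

With a deterministic gradient oracle on a strongly convex quadratic, this recursion is a stable linear iteration and contracts to the minimizer.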