Sub-sampled Trust-Region Methods with Deterministic Worst-Case Complexity Guarantees

In this paper, we develop and analyze sub-sampled trust-region methods for solving finite-sum optimization problems. These methods employ subsampling strategies to approximate the gradient and Hessian of the objective function, significantly reducing the overall computational cost. We propose a novel adaptive procedure for deterministically adjusting the sample size used for gradient (or gradient and Hessian) …
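To make the idea concrete, the following is a minimal sketch of a sub-sampled trust-region iteration on an illustrative least-squares finite sum. The Cauchy-point sub-step, the acceptance threshold, and the rule that enlarges the sample after a rejected step are generic placeholders for illustration only; they are not the deterministic adaptive procedure analyzed in the paper.

```python
import numpy as np

# Illustrative finite-sum least-squares problem: f(x) = (1/n) * sum_i 0.5*(a_i^T x - b_i)^2
rng = np.random.default_rng(0)
n, d = 1000, 5
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

def sampled_grad_hess(x, idx):
    """Sub-sampled gradient and Hessian averaged over the index set `idx`."""
    Ai, bi = A[idx], b[idx]
    r = Ai @ x - bi
    g = Ai.T @ r / len(idx)      # (1/|S|) * sum_{i in S} grad f_i(x)
    H = Ai.T @ Ai / len(idx)     # (1/|S|) * sum_{i in S} hess f_i(x)
    return g, H

def cauchy_step(g, H, radius):
    """Cauchy point of the quadratic trust-region model (a cheap, inexact sub-step)."""
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return np.zeros_like(g)
    gHg = g @ H @ g
    tau = 1.0 if gHg <= 0 else min(gnorm**3 / (radius * gHg), 1.0)
    return -tau * (radius / gnorm) * g

x, radius, sample_size = np.zeros(d), 1.0, 50
for k in range(100):
    idx = rng.choice(n, size=sample_size, replace=False)   # draw the sample S_k
    g, H = sampled_grad_hess(x, idx)
    s = cauchy_step(g, H, radius)
    pred = -(g @ s + 0.5 * s @ H @ s)                       # predicted model decrease
    f_old = 0.5 * np.mean((A @ x - b) ** 2)
    f_new = 0.5 * np.mean((A @ (x + s) - b) ** 2)
    rho = (f_old - f_new) / max(pred, 1e-16)                # actual vs. predicted decrease
    if rho > 0.1:                                           # successful step: accept, expand radius
        x = x + s
        radius = min(2.0 * radius, 10.0)
    else:                                                   # unsuccessful: shrink radius, enlarge sample
        radius *= 0.5                                       # (generic heuristic, not the paper's rule)
        sample_size = min(2 * sample_size, n)
```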

Subsampled cubic regularization method for finite-sum minimization

This paper proposes and analyzes a subsampled Cubic Regularization Method (CRM) for solving finite-sum optimization problems. The new method uses random subsampling techniques to approximate the functions, gradients, and Hessians in order to reduce the overall computational cost of the CRM. Under suitable hypotheses, first- and second-order iteration-complexity bounds and global convergence analyses are presented. …
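As a companion illustration, the sketch below performs one subsampled cubic-regularization iteration on the same kind of least-squares finite sum. The exact one-dimensional minimization of the cubic model along the negative sampled gradient, and the doubling of sigma and of the sample size after an unsuccessful step, are simple stand-ins for whichever subproblem solver and parameter updates the paper actually analyzes.

```python
import numpy as np

# Illustrative finite-sum least-squares setup; sigma is the cubic regularization parameter.
rng = np.random.default_rng(1)
n, d = 1000, 5
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

def sampled_model(x, idx):
    """Sub-sampled function value, gradient, and Hessian over the index set `idx`."""
    Ai, bi = A[idx], b[idx]
    r = Ai @ x - bi
    f = 0.5 * np.mean(r ** 2)
    g = Ai.T @ r / len(idx)
    H = Ai.T @ Ai / len(idx)
    return f, g, H

def cubic_cauchy_step(g, H, sigma):
    """Exactly minimize the cubic model along -g: a cheap, inexact subproblem solve."""
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return np.zeros_like(g)
    c1, c2 = g @ H @ g, sigma * gnorm ** 3
    # Positive root of d/d(alpha) [ -alpha*|g|^2 + 0.5*alpha^2*c1 + (sigma/3)*alpha^3*|g|^3 ] = 0
    alpha = (-c1 + np.sqrt(c1 ** 2 + 4.0 * c2 * gnorm ** 2)) / (2.0 * c2)
    return -alpha * g

x, sigma, sample_size = np.zeros(d), 1.0, 50
for k in range(100):
    idx = rng.choice(n, size=sample_size, replace=False)
    f_s, g, H = sampled_model(x, idx)
    s = cubic_cauchy_step(g, H, sigma)
    model_dec = -(g @ s + 0.5 * s @ H @ s + sigma / 3.0 * np.linalg.norm(s) ** 3)
    f_old = 0.5 * np.mean((A @ x - b) ** 2)
    f_new = 0.5 * np.mean((A @ (x + s) - b) ** 2)
    rho = (f_old - f_new) / max(model_dec, 1e-16)
    if rho > 0.1:                       # successful step: accept and relax regularization
        x = x + s
        sigma = max(0.5 * sigma, 1e-3)
    else:                               # unsuccessful: increase regularization and sample size
        sigma *= 2.0                    # (generic update, not the paper's specific rule)
        sample_size = min(2 * sample_size, n)
```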