New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization

In this paper, two new subspace minimization conjugate gradient methods based on p-regularization models are proposed, in which a special scaled norm in the p-regularization model is analyzed. Different choices of the scaled norm lead to different solutions of the p-regularized subproblem. Based on an analysis of these solutions in a two-dimensional subspace, we derive new search directions that satisfy the sufficient descent condition. With a modified nonmonotone line search, we establish the global convergence of the proposed methods under mild assumptions. The R-linear convergence of the proposed methods is also analyzed. Numerical results on the CUTEr library show that the proposed methods are superior to four conjugate gradient methods proposed by Hager and Zhang (SIAM J Optim 16(1):170-192, 2005), Dai and Kou (SIAM J Optim 23(1):296-320, 2013), Liu and Liu (J Optim Theory Appl 180(3):879-906, 2019), and Li et al. (Comput Appl Math 38(1), 2019), respectively.
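For orientation, the p-regularized subproblem referred to above is commonly stated in the following form; the notation used here (f_k, g_k, B_k, sigma_k, and the subspace spanned by the current gradient and the previous direction) reflects the standard setup in the subspace minimization conjugate gradient literature and is an assumption, not a quotation from the paper:

\min_{d \in \Omega_k} \; m_k(d) = f_k + g_k^{\top} d + \tfrac{1}{2}\, d^{\top} B_k d + \tfrac{\sigma_k}{p}\, \|d\|^{p}, \qquad \Omega_k = \operatorname{span}\{ g_k,\, d_{k-1} \}, \quad p > 2,

where \|\cdot\| stands for the (possibly scaled) norm whose choice the paper analyzes. Writing d = u\, g_k + v\, d_{k-1} reduces the subproblem to two scalar unknowns (u, v), which is what makes the two-dimensional analysis mentioned in the abstract tractable.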

Citation

1. School of Mathematics and Statistics, Xidian University, Xi'an 710126, China. 2. State Key Laboratory of Scientific and Engineering Computing, Institute of Computational Mathematics and Scientific/Engineering Computing, AMSS, Chinese Academy of Sciences, Beijing 100190, China. 2020.04.03
