A hybrid HS and PRP type conjugate gradient method for smooth optimization is presented, which reduces to the classical PRP or HS method when an exact line search is used, and which converges globally and R-linearly for nonconvex functions with an inexact backtracking line search under standard assumptions. An inexact version of the proposed method, which admits approximate gradient and/or approximate function values, is also given. Such a version is important for problems whose gradients or function values are unavailable or expensive to compute. The inexact version is proved to be globally convergent for general functions under an approximate descent line search. Moreover, the inexact method is applied to a nonsmooth convex optimization problem by converting it into a once continuously differentiable problem via the Moreau-Yosida regularization.
Report, Changsha University of Science and Technology, 28/10/2011
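To make the algorithmic setting concrete, the following is a minimal sketch of a nonlinear conjugate gradient method with a PRP+ update and an Armijo backtracking line search, which is the generic framework the abstract describes. This is an illustration only: the paper's specific hybrid HS/PRP formula is not reproduced here, and the function names, parameter values, and the steepest-descent restart safeguard are assumptions of this sketch.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with a PRP+ update and Armijo backtracking.

    Generic illustration; NOT the paper's hybrid HS/PRP rule.
    f    : objective, R^n -> R
    grad : gradient of f, R^n -> R^n
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        gtd = g @ d
        if gtd >= 0:
            # safeguard: restart with steepest descent if d is not a descent direction
            d = -g
            gtd = g @ d
        # Armijo backtracking line search (parameters are illustrative choices)
        t, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + t * d) > fx + c * t * gtd:
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g                            # gradient difference y_k
        beta = max(0.0, (g_new @ y) / (g @ g))   # PRP+ (nonnegative) beta
        d = -g_new + beta * d                    # new conjugate direction
        x, g = x_new, g_new
    return x
```

On a smooth test problem such as a strictly convex quadratic, this iteration converges to the minimizer; the paper's contribution lies in the hybrid beta formula and in the convergence analysis when only inexact gradients or function values are available.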