In unconstrained optimization, nonlinearity of the objective function or rounding errors in finite-precision arithmetic may produce NaN or infinite step sizes in the nonlinear conjugate gradient (NCG) method, or a step that violates the sufficient descent condition (SDC). In such cases the conjugate gradient (CG) direction must often be restarted with a scaled steepest descent variant, which can cause zigzagging. To address these shortcomings, we enhance the performance of the NCG method by replacing the CG direction with a new hybrid direction, one of whose two components reconstructs a suitable direction in regions near, but not at, a local minimizer. This hybrid search direction combines the steepest descent direction with a novel valley-seeking direction designed for regions containing points with small gradient norms. If the hybrid direction violates the SDC at some iterations, it is adjusted so that the SDC is satisfied. Theoretical analysis establishes the convergence properties of the new class of nonlinear CG methods. Our numerical experiments show that traditional NCG methods become more robust when they use the new hybrid direction in restart iterations.
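For context, the standard NCG iteration and the sufficient descent condition can be sketched as follows; the notation ($g_k$, $\beta_k$, $c$) is the usual textbook convention and is assumed here rather than taken from this work. The method generates iterates $x_{k+1} = x_k + \alpha_k d_k$, where $\alpha_k$ comes from a line search and the search directions satisfy
\[
d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad g_k = \nabla f(x_k),
\]
with the scalar $\beta_k$ distinguishing the NCG variants (e.g., Fletcher--Reeves uses $\beta_k^{\mathrm{FR}} = \|g_{k+1}\|^2 / \|g_k\|^2$). The SDC requires, for some constant $c > 0$ independent of $k$,
\[
g_k^{\top} d_k \le -c\,\|g_k\|^2,
\]
and a classical restart replaces $d_k$ by a scaled steepest descent direction $-\theta_k g_k$ with $\theta_k > 0$ whenever this condition fails or the computed quantities are NaN or infinite.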