In this paper, we propose algorithms that exploit negative curvature for solving noisy nonlinear nonconvex unconstrained optimization problems. We consider both deterministic and stochastic inexact settings and develop two-step algorithms that combine a direction of negative curvature with a descent direction to update the iterates. Under reasonable assumptions, we prove second-order convergence results and derive complexity guarantees for both settings. To tackle large-scale problems, we develop a practical variant that uses the conjugate gradient method with negative curvature detection and early stopping to compute a step, a simple adaptive step size scheme, and a strategy for selecting the sample sizes of the gradient and Hessian approximations as the optimization progresses. Numerical results on two machine learning problems showcase the efficacy and efficiency of the practical method.
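To make the two-step structure concrete, the following is a minimal sketch (not the paper's algorithm) of one iteration that takes a negative-curvature step followed by a gradient descent step on a toy nonconvex problem. The fixed step sizes `alpha` and `beta`, the exact eigendecomposition, and the quartic test function are illustrative assumptions; the methods described above instead use inexact gradient and Hessian information, a CG-based step computation, and an adaptive step size scheme.

```python
import numpy as np

def two_step_iteration(x, grad, hess, alpha=0.05, beta=0.05, curv_tol=1e-8):
    """One illustrative two-step update: a negative-curvature step, if such a
    direction exists, followed by a gradient descent step. Hypothetical fixed
    step sizes alpha and beta stand in for an adaptive step size scheme."""
    g = grad(x)
    H = hess(x)
    eigvals, eigvecs = np.linalg.eigh(H)          # exact spectrum; a sketch only
    if eigvals[0] < -curv_tol:                    # negative curvature detected
        d = eigvecs[:, 0]                         # most-negative-curvature direction
        if g @ d > 0:
            d = -d                                # orient so the step does not ascend
        x = x + beta * abs(eigvals[0]) * d        # negative-curvature step
        g = grad(x)                               # re-evaluate gradient at new point
    x = x - alpha * g                             # descent (gradient) step
    return x

# Toy nonconvex problem: f(x) = 0.25*||x||^4 - 0.5*||x||^2, a saddle at the origin
f    = lambda x: 0.25 * (x @ x) ** 2 - 0.5 * (x @ x)
grad = lambda x: (x @ x) * x - x
hess = lambda x: (x @ x) * np.eye(x.size) + 2.0 * np.outer(x, x) - np.eye(x.size)

x = np.zeros(2)                                   # start exactly at the saddle point
for _ in range(200):
    x = two_step_iteration(x, grad, hess)
print(x, f(x))                                    # curvature steps let the iterate escape
```

Starting from the saddle point, a pure gradient method would stall because the gradient vanishes there; the negative-curvature step supplies the escape direction, illustrating why second-order convergence guarantees require such steps.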