This paper develops negative curvature methods for unconstrained continuous nonlinear optimization in stochastic settings, in which function, gradient, and Hessian information is available only through probabilistic oracles, i.e., oracles that return approximations of a specified accuracy with a specified reliability. We introduce conditions on these oracles and design a two-step framework that systematically combines gradient and negative curvature steps. The framework employs an early-stopping mechanism to guarantee sufficient progress and an adaptive, Armijo-type criterion to select the step sizes for both steps. We establish high-probability iteration-complexity guarantees for attaining approximate second-order stationary points, deriving explicit tail bounds that quantify the convergence neighborhood and its dependence on the oracle noise. Importantly, these bounds match the deterministic rates up to noise-dependent terms, and the framework recovers the deterministic results as a special case. Finally, numerical experiments demonstrate the practical benefit of exploiting negative curvature directions even in the presence of noise.
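
To make the two-step structure concrete, the following is a minimal Python sketch of one possible instantiation, not the paper's exact method: each iteration takes a gradient step and then a negative curvature step, each accepted via an Armijo-type backtracking test. The quadratic decrease model in the curvature step mirrors the second-order Taylor term along the eigenvector; the paper's oracle conditions, adaptive step-size rule, and early-stopping mechanism are not reproduced. All names (two_step_method, grad_oracle, hess_oracle), the additive-Gaussian noise model, and the constants are illustrative assumptions.

    import numpy as np

    def backtrack(f, x, d, decrease, alpha0=1.0, beta=0.5, c=1e-4, max_trials=30):
        # Armijo-type test: accept alpha once
        # f(x + alpha*d) <= f(x) + c * decrease(alpha),
        # where decrease(alpha) < 0 is the model-predicted reduction.
        fx = f(x)
        alpha = alpha0
        for _ in range(max_trials):
            if f(x + alpha * d) <= fx + c * decrease(alpha):
                return alpha
            alpha *= beta
        return 0.0  # no acceptable step size found; skip this direction

    def two_step_method(f, grad_oracle, hess_oracle, x0, n_iters=200,
                        eps_g=1e-6, eps_h=1e-6):
        x = np.asarray(x0, dtype=float).copy()
        for _ in range(n_iters):
            g = grad_oracle(x)                  # noisy gradient estimate
            H = hess_oracle(x)                  # noisy Hessian estimate
            # Step 1: gradient step, linear decrease model -alpha*||g||^2.
            if np.linalg.norm(g) > eps_g:
                alpha = backtrack(f, x, -g, lambda a: -a * (g @ g))
                x = x - alpha * g
            # Step 2: negative curvature step along the eigenvector of the
            # (symmetrized) Hessian estimate with the most negative eigenvalue,
            # quadratic decrease model 0.5 * lam_min * alpha^2.
            lam, V = np.linalg.eigh(0.5 * (H + H.T))
            if lam[0] < -eps_h:
                v = V[:, 0]
                if g @ v > 0:                   # orient v as a descent direction
                    v = -v
                t = backtrack(f, x, v, lambda a: 0.5 * lam[0] * a * a)
                x = x + t * v
        return x

    if __name__ == "__main__":
        # Toy problem with a saddle at the origin: f(x) = x0^2 - x1^2 + x1^4.
        # Additive Gaussian noise stands in for the probabilistic oracles.
        rng = np.random.default_rng(0)
        sigma = 1e-3
        f_true = lambda x: x[0]**2 - x[1]**2 + x[1]**4
        f = lambda x: f_true(x) + sigma * rng.standard_normal()
        grad = lambda x: (np.array([2*x[0], -2*x[1] + 4*x[1]**3])
                          + sigma * rng.standard_normal(2))
        hess = lambda x: (np.diag([2.0, -2.0 + 12*x[1]**2])
                          + sigma * rng.standard_normal((2, 2)))
        x = two_step_method(f, grad, hess, [0.1, 0.0])
        # The curvature step escapes the saddle toward the minima at x1 = ±1/sqrt(2).
        print("final x:", x, " f(x):", f_true(x))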