When seeking an approximate second-order stationary point in nonconvex constrained stochastic optimization, is it possible to design a stochastic second-order method that achieves the same order of sample complexity as in the unconstrained setting? To address this question, we first introduce Carme, a curvature-oriented variance-reduction method for unconstrained nonconvex stochastic optimization. Under a smoothness assumption on the stochastic objective function, the sample complexity of Carme, measured in evaluations of first- and second-order oracles, improves upon the best-known results in the literature. We then propose Carme-ALM, an augmented-Lagrangian-based variant of Carme tailored to nonconvex stochastic optimization with deterministic constraints. Under suitable conditions, we prove that Carme-ALM finds an approximate second-order stationary point with a sample complexity comparable to that of the unconstrained case. This provides a positive, albeit conditional, answer to the question posed above.