In this paper, we present a version of the Cubic Regularization of Newton's method for unconstrained nonconvex optimization in which the Hessian matrices are approximated by forward finite-difference Hessians. The regularization parameter of the cubic models and the accuracy of the Hessian approximations are jointly adjusted using a nonmonotone line-search criterion. Assuming that the Hessian of the objective function is globally Lipschitz continuous, we show that the proposed method needs at most $\mathcal{O}\left(n\epsilon^{-3/2}\right)$ function and gradient evaluations to generate an $\epsilon$-approximate stationary point, where $n$ is the dimension of the domain of the objective function. Preliminary numerical results corroborate our theoretical findings.
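For concreteness, the following is a minimal sketch of the two ingredients named above, written in standard cubic-regularization notation; the symbols $x_k$, $g_k$, $B_k$, $\sigma_k$, $h_k$, and $e_j$ are notational assumptions of this sketch, not definitions taken from the paper. The $j$-th column of the approximate Hessian $B_k$ is obtained from a forward difference of gradients, which costs one extra gradient evaluation per coordinate and accounts for the factor $n$ in the evaluation bound; the trial step then approximately minimizes the cubic model $m_k$.
\[
  % Forward finite-difference approximation of the j-th Hessian column,
  % with difference step h_k > 0 and canonical basis vector e_j:
  [B_k]_{\cdot j} \;\approx\; \frac{\nabla f(x_k + h_k e_j) - \nabla f(x_k)}{h_k},
  \qquad j = 1, \dots, n,
\]
\[
  % Cubic-regularized model minimized at iteration k, with regularization
  % parameter \sigma_k > 0 adjusted jointly with h_k by the nonmonotone test:
  m_k(s) \;=\; f(x_k) + \nabla f(x_k)^{\top} s
  + \tfrac{1}{2}\, s^{\top} B_k s + \tfrac{\sigma_k}{3}\, \|s\|^{3}.
\]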