This paper studies the convergence of a Nesterov accelerated variant of the Broyden-Fletcher-Goldfarb-Shanno quasi-Newton method (NA-BFGS) in the setting where the objective function is strongly convex, its gradient is Lipschitz continuous, and its Hessian is Lipschitz continuous at the optimal point. We demonstrate that, like the classical BFGS method, the Nesterov accelerated BFGS method achieves a non-asymptotic superlinear convergence rate of the form $\left(\frac{1}{k}\right)^{k/4}$ within a local neighbourhood of the optimal point. This work thus provides an explicit, non-asymptotic theoretical explanation of the superlinear convergence of NA-BFGS.
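To make the object of study concrete, the sketch below shows one common form of a Nesterov accelerated BFGS iteration: a momentum extrapolation step followed by a quasi-Newton step taken from the extrapolated point, with the standard BFGS update of the inverse Hessian approximation. The fixed momentum coefficient `beta`, the unit step size, the curvature guard, and all variable names are illustrative assumptions, not the exact scheme analyzed in the paper.

```python
import numpy as np

def na_bfgs(grad, x0, beta=0.9, tol=1e-10, max_iter=100):
    """Minimal sketch of a Nesterov accelerated BFGS iteration.

    grad : callable returning the gradient of the objective f
    beta : momentum coefficient (fixed here; schedules vary by variant)
    """
    n = x0.size
    H = np.eye(n)                        # inverse Hessian approximation
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(max_iter):
        y = x + beta * (x - x_prev)      # Nesterov extrapolation step
        g = grad(y)
        if np.linalg.norm(g) < tol:
            break
        x_new = y - H @ g                # quasi-Newton step from the extrapolated point
        s = x_new - y                    # displacement
        u = grad(x_new) - g              # gradient difference
        su = s @ u
        if su > 1e-12:                   # curvature condition guard
            rho = 1.0 / su
            V = np.eye(n) - rho * np.outer(s, u)
            H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse update
        x_prev, x = x, x_new
    return x

# Usage on a strongly convex quadratic f(x) = 0.5 * x^T A x, whose gradient is A x.
A = np.diag([1.0, 10.0, 100.0])
x_star = na_bfgs(lambda x: A @ x, np.ones(3))
```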