This article introduces a new retraction on the symplectic Stiefel manifold. The dominant computational cost of the new retraction is the inversion of a single $2p \times 2p$ matrix, which is considerably cheaper than the operations required by the retractions available in the literature. Using the new retraction, we then design a constraint-preserving gradient method for minimizing smooth functions defined on the symplectic Stiefel manifold. To improve the numerical performance of our approach, we employ the non-monotone line search of Zhang and Hager together with an adaptive Barzilai–Borwein-type step size. Our numerical studies show that the proposed procedure is computationally promising and constitutes a competitive alternative for solving large-scale optimization problems over the symplectic Stiefel manifold.
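To make the algorithmic ingredients concrete, the following minimal Python sketch illustrates the generic template of a retraction-based gradient method equipped with the Zhang–Hager non-monotone line search and an alternating Barzilai–Borwein step size. The callables `f`, `grad`, and `retract`, as well as all parameter values, are placeholders supplied for illustration only; this sketch does not reproduce the paper's specific retraction, metric, or implementation details.

```python
import numpy as np

def nonmonotone_bb_descent(f, grad, retract, X0, max_iter=500,
                           eta=0.85, delta=1e-4, tol=1e-6,
                           alpha0=1e-3, alpha_min=1e-10, alpha_max=1e10):
    """Sketch of a retraction-based gradient method with the Zhang-Hager
    non-monotone line search and an alternating Barzilai-Borwein step size.

    `f`, `grad`, and `retract` are user-supplied callables (assumptions,
    not part of the paper): `retract(X, D)` maps a point X and a tangent
    direction D back onto the feasible set.
    """
    X = X0
    C, Q = f(X), 1.0          # Zhang-Hager reference value and weight
    alpha = alpha0
    G = grad(X)
    for k in range(max_iter):
        gnorm2 = np.sum(G * G)
        if np.sqrt(gnorm2) < tol:
            break
        # Backtrack until the non-monotone (Zhang-Hager) condition holds:
        #   f(R_X(-t G)) <= C_k - delta * t * ||G||^2
        t = alpha
        X_new = retract(X, -t * G)
        f_new = f(X_new)
        while f_new > C - delta * t * gnorm2 and t > alpha_min:
            t *= 0.5
            X_new = retract(X, -t * G)
            f_new = f(X_new)
        G_new = grad(X_new)
        # Barzilai-Borwein trial step from the iterate/gradient differences,
        # alternating between the two classical BB formulas.
        S, Y = X_new - X, G_new - G
        sy = np.sum(S * Y)
        if sy > 0:
            bb = np.sum(S * S) / sy if k % 2 == 0 else sy / np.sum(Y * Y)
            alpha = min(max(bb, alpha_min), alpha_max)
        else:
            alpha = alpha0
        # Zhang-Hager averaged reference value update
        Q_new = eta * Q + 1.0
        C = (eta * Q * C + f_new) / Q_new
        Q = Q_new
        X, G = X_new, G_new
    return X
```

In this template, only the choices of `retract` and of the (Riemannian) gradient depend on the manifold; the non-monotone acceptance rule and the Barzilai–Borwein step size are manifold-agnostic ingredients.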