In this paper, based on regularization techniques and projected gradient strategies, we present a quadratic regularization projected alternating Barzilai–Borwein (QRPABB) method for minimizing differentiable functions over closed convex sets. We establish convergence of the QRPABB method to a constrained stationary point under a nonmonotone line search. When the objective function is convex, we prove that the error in the objective value at iteration $k$ is bounded by $a/(k+1)$ for some constant $a$ independent of $k$, i.e., the method converges at rate $O(1/k)$. Moreover, when the objective function is strongly convex, the convergence rate improves to $R$-linear. Numerical comparisons on box-constrained quadratic programming and nonnegative matrix factorization problems show that the QRPABB method is promising.
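To illustrate the family of methods the abstract refers to, the following is a minimal sketch of a projected gradient iteration with alternating Barzilai–Borwein step sizes, applied to a box-constrained quadratic. It is not the QRPABB method itself: the quadratic regularization and the nonmonotone line search described in the paper are omitted, and the function and parameter names are our own illustrative choices.

```python
import numpy as np

def projected_alternating_bb(grad, project, x0, max_iter=500, tol=1e-8):
    """Projected gradient with alternating BB1/BB2 step sizes (a sketch;
    the paper's QRPABB adds quadratic regularization and a nonmonotone
    line search on top of this basic scheme)."""
    x = project(np.asarray(x0, dtype=float))
    g = grad(x)
    alpha = 1.0  # initial step size
    for k in range(max_iter):
        x_new = project(x - alpha * g)   # projected gradient step
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 0:
            # alternate between the two classical BB step sizes
            alpha = (s @ s) / sy if k % 2 == 0 else sy / (y @ y)
        x, g = x_new, g_new
    return x

# Example: min 0.5 x'Ax - b'x subject to the box 0 <= x <= 1
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
sol = projected_alternating_bb(
    lambda x: A @ x - b,              # gradient of the quadratic
    lambda x: np.clip(x, 0.0, 1.0),   # projection onto the box
    np.zeros(2),
)
```

For this small strictly convex example the unconstrained minimizer $A^{-1}b = (0.2, 0.4)$ lies inside the box, so the iteration recovers it; in general the projection keeps every iterate feasible.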