In 1988, Barzilai and Borwein proposed a new choice of step size for the gradient method for solving unconstrained minimization problems, aiming to accelerate the convergence of the steepest descent method. The Barzilai-Borwein method requires little storage and inexpensive computation, and consequently several authors have studied it and proposed variants for solving large-scale unconstrained minimization problems. In this paper, we extend the Barzilai-Borwein method and establish global and Q-superlinear convergence of the proposed method for minimizing a strictly convex quadratic function. Furthermore, we discuss an application of our method to general objective functions. Finally, some numerical results are reported.
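For reference, the original Barzilai-Borwein gradient method (not the extension proposed in this paper) can be sketched for a strictly convex quadratic f(x) = (1/2)x^T A x - b^T x. The sketch below is illustrative; the function name, the choice of the first step by exact line search, and the stopping parameters are assumptions, not part of the original presentation.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-10, max_iter=500):
    """Gradient method with the Barzilai-Borwein (BB1) step size
    for the strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
    where A is symmetric positive definite."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                      # gradient of the quadratic
    alpha = (g @ g) / (g @ (A @ g))    # exact line search for the first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s = x_new - x                  # step difference
        y = g_new - g                  # gradient difference (= A s here)
        alpha = (s @ s) / (s @ y)      # BB step: alpha_k = s^T s / s^T y
        x, g = x_new, g_new
    return x
```

Note that the BB step size uses only the two most recent iterates and gradients, which is why the method needs so little storage; the iteration is typically non-monotone in the objective value but still converges for strictly convex quadratics.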