In this paper, a reliable hybrid algorithm for solving convex quadratic minimization problems is presented. At each iteration, two points are computed: first, an auxiliary point $\dot{x}_k$ is generated by a gradient step with an optimal steplength; then, the next iterate $x_{k+1}$ is obtained as a weighted combination of $\dot{x}_k$ and the penultimate iterate $x_{k-1}$. The coefficient of this combination is computed by minimizing the residual norm along the line determined by the two previous points. In particular, we adopt an optimal, non-delayed steplength in the first step and then apply a smoothing technique to impose a delay on the scheme. Under a weak assumption, we show that our algorithm converges Q-linearly to the unique solution of the problem. Finally, we report numerical experiments on strictly convex quadratic problems, showing that the proposed method outperforms, in terms of efficiency, a wide collection of state-of-the-art Barzilai-Borwein-type gradient methods, while being competitive with the conjugate gradient method in CPU time and iteration count.
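The two-step structure described above can be sketched as follows. This is a minimal illustration, not the paper's method: it uses the exact (Cauchy) steplength for the gradient step and chooses the combination coefficient by exact line minimization of the residual norm between the auxiliary point and the penultimate iterate; the smoothing/delay technique on the steplength is not reproduced, and all names (`hybrid_qp`, `x_aux`, etc.) are illustrative assumptions.

```python
import numpy as np

def hybrid_qp(A, b, x0, tol=1e-8, max_iter=1000):
    """Sketch of a two-step hybrid gradient scheme for
    min 0.5 x^T A x - b^T x with A symmetric positive definite.
    Illustrative only; does not include the paper's delayed steplength."""
    x_prev = x0.copy()  # penultimate iterate x_{k-1}
    x = x0.copy()       # current iterate x_k
    for _ in range(max_iter):
        g = A @ x - b                      # gradient (= residual) at x_k
        if np.linalg.norm(g) < tol:
            break
        # Step 1: gradient step with the exact (Cauchy) steplength
        alpha = (g @ g) / (g @ (A @ g))
        x_aux = x - alpha * g
        # Step 2: combine x_aux with x_{k-1}, picking the coefficient t
        # that minimizes ||A x - b|| along the line through the two points
        d = x_aux - x_prev
        Ad = A @ d
        r_prev = A @ x_prev - b
        denom = Ad @ Ad
        t = -(r_prev @ Ad) / denom if denom > 0 else 1.0
        x_next = x_prev + t * d
        x_prev, x = x, x_next
    return x

# Usage on a small random SPD system
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)   # well-conditioned SPD matrix
b = rng.standard_normal(5)
x = hybrid_qp(A, b, np.zeros(5))
print(np.linalg.norm(A @ x - b))  # residual norm, near zero at convergence
```

The residual-minimizing coefficient has the closed form $t^\ast = -r_{k-1}^{\top} A d \,/\, \|A d\|^2$ with $d = \dot{x}_k - x_{k-1}$, since the residual is affine in $t$ along the line.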