We present a convergence analysis of accelerated forward-backward splitting methods for minimizing composite functions, in the case where the proximity operator is not available in closed form and must therefore be computed up to a certain precision. We prove that the $1/k^2$ convergence rate for the function values can still be achieved, provided the admissible errors are of a certain type and satisfy a sufficiently fast decay condition. Our analysis is based on the machinery of estimate sequences, first introduced by Nesterov for the study of accelerated gradient descent algorithms. An experimental analysis validating the obtained rates is also presented.
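For concreteness, the setting can be sketched as follows; the notation below ($f$, $g$, $\lambda$, $t_k$, $\varepsilon_k$) is illustrative rather than fixed by the abstract, and the symbol $\approx_{\varepsilon_k}$ stands schematically for an approximate evaluation whose precise error notion is the one analyzed in the paper. The composite problem and the proximity operator are
\[
\min_{x \in \mathcal{H}} \; F(x) := f(x) + g(x),
\qquad
\operatorname{prox}_{\lambda g}(y) := \operatorname*{argmin}_{x \in \mathcal{H}} \Big\{ g(x) + \tfrac{1}{2\lambda}\|x - y\|^2 \Big\},
\]
where $f$ is convex and differentiable with Lipschitz continuous gradient and $g$ is convex and lower semicontinuous. An accelerated forward-backward step with an inexactly computed proximity operator then takes the schematic form
\[
x_k \approx_{\varepsilon_k} \operatorname{prox}_{\lambda g}\big(y_k - \lambda \nabla f(y_k)\big),
\qquad
y_{k+1} = x_k + \tfrac{t_k - 1}{t_{k+1}}\,(x_k - x_{k-1}),
\]
and the claim is that $F(x_k) - \inf F = O(1/k^2)$ is recovered when the errors $(\varepsilon_k)$ are of the admissible type and decay sufficiently fast.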