We investigate the integration of Nesterov-type acceleration into primal-dual methods for structured convex optimization. While proximal splitting algorithms efficiently handle composite problems of the form $\min_x f(x) + g(x) + h(Kx)$, accelerating their convergence with respect to the smooth term $f$ is notoriously challenging due to the rotational dynamics in the primal-dual space. In this paper, we overcome this barrier by proposing the Accelerated Proximal Alternating Predictor-Corrector algorithm (APAPC), focusing on the setting where $g(x) = \frac{\mu_g}{2}\|x\|^2$. Our analysis reveals that Nesterov momentum can be seamlessly integrated into a primal-dual forward-backward scheme by exploiting the strong convexity of the dual problem to stabilize the accelerated primal updates. Using a unified Lyapunov framework, we establish an optimal $O(1/t^2)$ sublinear convergence rate, as well as accelerated linear convergence when $\mu_g > 0$, across three regimes of dual strong convexity: (i) when $h$ is smooth, (ii) when the adjoint operator $K^*$ is bounded below, and (iii) for linearly constrained optimization. Furthermore, leveraging recent results on accelerated gradient descent, we characterize the weak convergence of the primal-dual iterates to a saddle-point solution.
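For context, the display below sketches the saddle-point reformulation that primal-dual forward-backward methods target, together with a generic PAPC-type predictor-corrector step augmented with Nesterov extrapolation. This is a schematic illustration under standard notation, not the exact APAPC updates or parameter choices analyzed in the paper: the step sizes $\tau, \sigma$, the extrapolation parameter $\theta_t$, and the placement of $\operatorname{prox}_{\tau g}$ in both primal half-steps are assumptions of the sketch.

% Saddle-point reformulation of the composite problem, with h^* the convex conjugate of h:
\[
  \min_x \max_y \; f(x) + \frac{\mu_g}{2}\|x\|^2 + \langle Kx, y \rangle - h^*(y).
\]
% Generic accelerated predictor-corrector iteration (schematic, not the paper's exact method):
\begin{align*}
  \bar{x}^{\,t}   &= x^t + \theta_t \,(x^t - x^{t-1})
      && \text{(Nesterov extrapolation)}\\
  \hat{x}^{\,t+1} &= \operatorname{prox}_{\tau g}\!\big(\bar{x}^{\,t} - \tau(\nabla f(\bar{x}^{\,t}) + K^* y^t)\big)
      && \text{(primal predictor)}\\
  y^{t+1}         &= \operatorname{prox}_{\sigma h^*}\!\big(y^t + \sigma K \hat{x}^{\,t+1}\big)
      && \text{(dual update)}\\
  x^{t+1}         &= \operatorname{prox}_{\tau g}\!\big(\bar{x}^{\,t} - \tau(\nabla f(\bar{x}^{\,t}) + K^* y^{t+1})\big)
      && \text{(primal corrector)}
\end{align*}

Note that for the quadratic $g(x) = \frac{\mu_g}{2}\|x\|^2$ considered here, the primal proximal step is explicit, $\operatorname{prox}_{\tau g}(z) = z/(1 + \tau\mu_g)$, so the strongly convex structure can be exploited at negligible per-iteration cost.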