Nesterov’s accelerated gradient (NAG) method generalizes the classical gradient descent algorithm, improving the convergence rate from $\mathcal{O}\left(\frac{1}{t}\right)$ to $\mathcal{O}\left(\frac{1}{t^2}\right)$ for convex optimization. This study examines the proximal gradient framework for additively separable composite functions with smooth and non-smooth components. We demonstrate that Nesterov’s accelerated proximal gradient (NAPG$_\alpha$) method attains a convergence rate of $o\left(\frac{1}{t^2}\right)$ for strong-weak convex functions when $\alpha>3$. We also present a Lyapunov analysis that establishes the rapid convergence of the composite gradient operator when the smooth component is strongly convex and the non-smooth component is weakly convex. Finally, we establish the equivalence between Nesterov’s accelerated proximal gradient method and the Ravine accelerated proximal gradient method.
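For orientation, a minimal sketch of an $\alpha$-parameterized accelerated proximal gradient iteration for a composite objective $F = f + g$ is given below; the symbols $x_t$, $y_t$, the step size $s$, and the particular momentum coefficient are standard choices assumed here for illustration, not fixed by the statement above:
\[
y_t = x_t + \frac{t-1}{t+\alpha-1}\,(x_t - x_{t-1}),
\qquad
x_{t+1} = \operatorname{prox}_{s g}\!\bigl(y_t - s\,\nabla f(y_t)\bigr),
\]
where $f$ is the smooth component with $L$-Lipschitz gradient, $g$ is the non-smooth component, and $s \le 1/L$. The choice $\alpha = 3$ recovers the classical Nesterov/FISTA momentum, while $\alpha > 3$ is the regime in which the $o\left(\frac{1}{t^2}\right)$ rate above is obtained.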