In this paper, we propose a new primal-dual algorithm for minimizing f(x)+g(x)+h(Ax), where f, g, and h are convex functions, f is differentiable with a Lipschitz continuous gradient, and A is a bounded linear operator. Several well-known primal-dual algorithms for minimizing the sum of two functions arise as special cases. For example, it reduces to the Chambolle-Pock algorithm when f=0 and to the primal-dual fixed-point algorithm in [P. Chen, J. Huang, and X. Zhang, A primal-dual fixed-point algorithm for convex separable minimization with applications to image restoration, Inverse Problems, 29 (2013), p.025011] when g=0. In addition, it recovers the three-operator splitting scheme in [D. Davis and W. Yin, A three-operator splitting scheme and its optimization applications, arXiv:1504.01032, (2015)] when A is the identity operator. We prove convergence of the new algorithm in the general case by showing that the iteration is a nonexpansive operator, and we derive a linear convergence rate under additional assumptions. Compared with other primal-dual algorithms for the same problem, the new algorithm admits a wider range of parameters that guarantee convergence and has a lower per-iteration cost. Numerical experiments demonstrate its efficiency in comparison with other primal-dual algorithms.
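The abstract does not state the update formulas of the new algorithm, but the Chambolle-Pock special case (f = 0) mentioned above, which solves min_x g(x) + h(Ax), can be sketched as follows. The proximal maps prox_g and prox_h, the operator A and its adjoint, the step sizes tau and sigma, and the toy denoising problem at the bottom are illustrative placeholders chosen for this sketch, not details taken from the paper.

```python
import numpy as np

def chambolle_pock(prox_g, prox_h, A, At, x0, tau, sigma, n_iter=500):
    """Chambolle-Pock iteration for min_x g(x) + h(Ax), i.e. the f = 0
    special case of the three-function problem.  prox_g(v, t) and
    prox_h(v, t) evaluate prox_{t g}(v) and prox_{t h}(v); A and At apply
    the linear operator and its adjoint.  Convergence requires
    tau * sigma * ||A||^2 <= 1."""
    x = x0.copy()
    y = np.zeros_like(A(x0))
    x_bar = x.copy()
    for _ in range(n_iter):
        # dual step: prox of the conjugate h* via the Moreau identity
        v = y + sigma * A(x_bar)
        y = v - sigma * prox_h(v / sigma, 1.0 / sigma)
        # primal step: proximal gradient-type update on g
        x_new = prox_g(x - tau * At(y), tau)
        # over-relaxation of the primal variable
        x_bar = 2.0 * x_new - x
        x = x_new
    return x

# Toy example (an assumption, not from the paper):
# min_x 0.5*||x - b||^2 + ||D x||_1 with D a finite-difference operator,
# so g is quadratic and h is the l1 norm.
if __name__ == "__main__":
    n = 100
    b = np.sign(np.random.randn(n))
    D = np.diff(np.eye(n), axis=0)                 # (n-1) x n difference matrix
    A = lambda x: D @ x
    At = lambda y: D.T @ y
    prox_g = lambda v, t: (v + t * b) / (1.0 + t)  # prox of 0.5*||. - b||^2
    prox_h = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)  # soft-threshold
    L = np.linalg.norm(D, 2)                       # operator norm of D
    step = 0.95 / L                                # so that tau*sigma*L^2 < 1
    x = chambolle_pock(prox_g, prox_h, A, At, np.zeros(n), tau=step, sigma=step)
```

The same primal-dual structure, with an additional explicit gradient step on f, underlies the algorithm proposed in the paper; when f is nonzero the step-size condition and the update of the primal variable change accordingly.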