Convex optimization problems, whose solutions live in very high-dimensional spaces, have become ubiquitous. To solve them, proximal splitting algorithms are particularly well suited: they consist of simple operations, handling the terms of the objective function separately. We present several existing proximal splitting algorithms and derive new ones, within a unified framework, which consists in applying splitting methods for monotone inclusions, such as the forward-backward algorithm, in primal-dual product spaces with a well-chosen metric. This allows us to derive new convergence theorems with larger parameter ranges. In particular, when the smooth term in the objective function is quadratic, e.g., for least-squares problems, convergence is guaranteed with larger values of the relaxation parameter than previously known. Indeed, it is often the case in practice that the larger the relaxation parameter, the faster the convergence.
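As a minimal illustration of the class of methods discussed, the sketch below applies the relaxed forward-backward iteration to an l1-regularized least-squares (lasso) problem. The problem instance, step size, and relaxation value are illustrative assumptions; the relaxation parameter is kept within the classical range (0, (4 - gamma*L)/2), not the enlarged range established for quadratic smooth terms.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t * ||.||_1 (the "backward" step for the l1 term).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def relaxed_forward_backward(A, b, lam, rho=1.4, n_iter=500):
    # Relaxed forward-backward iteration for
    #   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    # The gradient of the smooth (quadratic) term is L-Lipschitz with
    # L = ||A||^2; gamma = 1/L satisfies the classical step-size condition
    # gamma in (0, 2/L), and rho = 1.4 lies in the classical relaxation
    # range (0, (4 - gamma*L)/2) = (0, 1.5) for this gamma.
    L = np.linalg.norm(A, 2) ** 2
    gamma = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                            # forward (gradient) step
        y = soft_threshold(x - gamma * grad, gamma * lam)   # backward (prox) step
        x = x + rho * (y - x)                               # relaxation
    return x
```

A solution is a fixed point of the forward-backward map, so convergence can be monitored through the residual `||y - x||`; over-relaxation (rho > 1) typically accelerates convergence, which is the practical motivation for enlarging the admissible range of rho.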