In this work, we investigate the application of Davis–Yin splitting (DYS) to convex optimization problems and demonstrate that swapping the roles of the two nonsmooth convex functions can result in a faster convergence rate. Such a swap typically yields a different sequence of iterates, yet its effect on convergence behavior has been largely overlooked. We address this gap by establishing the best known convergence rates for DYS and its swapped counterpart, using the primal–dual gap function as the performance metric. Our results indicate that the two variants of the Douglas–Rachford splitting algorithm (a special case of DYS) share the same worst-case rate, whereas the convergence rates of the two DYS variants differ. This discrepancy is further illustrated through concrete examples.
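For concreteness, the swap discussed above can be made explicit with one standard form of the DYS iteration (the notation here is assumed, not taken from the abstract): for $\min_x f(x) + g(x) + h(x)$ with $f, g$ nonsmooth convex and $h$ smooth with Lipschitz gradient, and stepsize $\gamma > 0$,
\begin{align*}
x_g^k &= \mathrm{prox}_{\gamma g}(z^k), \\
x_f^k &= \mathrm{prox}_{\gamma f}\!\bigl(2x_g^k - z^k - \gamma \nabla h(x_g^k)\bigr), \\
z^{k+1} &= z^k + x_f^k - x_g^k.
\end{align*}
The swapped variant applies the same recursion with $f$ and $g$ interchanged, i.e.\ $\mathrm{prox}_{\gamma f}$ is evaluated first and $\mathrm{prox}_{\gamma g}$ second, which in general produces a different sequence of iterates. Setting $h \equiv 0$ recovers the two corresponding variants of Douglas–Rachford splitting.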