For a class of multi-block separable convex programming problems arising in machine learning and statistical inference, we propose a proximal alternating direction method of multipliers (PADMM) with partially parallel splitting, which has the following nice properties: (1) to alleviate the weight of the proximal terms, the restrictions imposed on the proximal parameters are relaxed substantially; (2) to maintain the inherent structure of the primal variables x_i (i = 1, 2, . . . , m), the relaxation parameter is attached only to the update formula of the dual variable. For the resulting method, we establish global convergence and a worst-case O(1/t) convergence rate in the ergodic sense, where t is the iteration counter. Finally, two numerical examples are given to illustrate the theoretical results.