This paper answers several open questions about the alternating direction method of multipliers (ADMM) and the block coordinate descent (BCD) method, both of which are now widely used to solve large-scale convex optimization problems in many fields. For ADMM, theoretical understanding is still lacking when the objective function is not separable across the variables. In this paper, we analyze the convergence of the 2-block ADMM for linearly constrained convex optimization with a coupled quadratic objective, and show that the classical ADMM converges pointwise to a primal-dual solution pair of this problem. Moreover, we propose the randomly permuted ADMM (RPADMM) for nonseparable multi-block convex optimization, and prove its expected convergence when applied to a class of quadratic programming problems. When the linear constraint vanishes, the 2-block ADMM and the randomly permuted ADMM reduce, respectively, to the 2-block cyclic BCD method (also known as the alternating minimization method) and to EPOCHS, the randomly permuted cyclic BCD method, also known as the "sampling without replacement" variant of randomized BCD (for simplicity, we call it EPOCHS, as suggested in the survey paper of Stephen Wright). Interestingly, our study provides the first iterate convergence result for the 2-block cyclic BCD method that does not assume boundedness of the iterates. Under the same setting, a sublinear convergence rate of the function values can also be established. Moreover, we theoretically establish the expected iterate convergence of the multi-block EPOCHS for convex quadratic optimization, which can be regarded as one of the first iterate convergence analyses of EPOCHS. Last but not least, although random permutation indeed makes multi-block ADMM and BCD more robust, we theoretically demonstrate that EPOCHS has a worse convergence rate than the cyclic BCD method when solving 2-block convex quadratic minimization problems.
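To make the 2-block scheme concrete, the following numpy sketch applies classical ADMM to a small synthetic instance of linearly constrained minimization with a coupled quadratic objective (a nonzero off-diagonal Hessian block). All problem data, variable names, and the penalty parameter `beta` are illustrative assumptions, not taken from the paper; each subproblem is an exact minimization of the augmented Lagrangian, which here reduces to a linear solve.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, m = 3, 3, 2

# Coupled quadratic objective f(x1, x2) = 0.5 z^T H z + c^T z with z = (x1, x2);
# H is symmetric positive definite with a nonzero coupling block H12.
M = rng.standard_normal((n1 + n2, n1 + n2))
H = M.T @ M + 0.5 * np.eye(n1 + n2)
c = rng.standard_normal(n1 + n2)
H11, H12 = H[:n1, :n1], H[:n1, n1:]
H21, H22 = H[n1:, :n1], H[n1:, n1:]
c1, c2 = c[:n1], c[n1:]

# Linear constraint A1 x1 + A2 x2 = b.
A1 = rng.standard_normal((m, n1))
A2 = rng.standard_normal((m, n2))
b = rng.standard_normal(m)

beta = 1.0                       # penalty parameter (illustrative choice)
x1, x2 = np.zeros(n1), np.zeros(n2)
lam = np.zeros(m)                # dual multiplier for the linear constraint

for _ in range(5000):
    # x1-update: minimize the augmented Lagrangian in x1 (exact linear solve).
    rhs1 = A1.T @ lam - H12 @ x2 - c1 - beta * A1.T @ (A2 @ x2 - b)
    x1 = np.linalg.solve(H11 + beta * A1.T @ A1, rhs1)
    # x2-update: same, using the freshly updated x1.
    rhs2 = A2.T @ lam - H21 @ x1 - c2 - beta * A2.T @ (A1 @ x1 - b)
    x2 = np.linalg.solve(H22 + beta * A2.T @ A2, rhs2)
    # Dual update along the constraint residual.
    lam = lam - beta * (A1 @ x1 + A2 @ x2 - b)

z = np.concatenate([x1, x2])
A = np.hstack([A1, A2])
primal_res = np.linalg.norm(A @ z - b)              # feasibility
kkt_res = np.linalg.norm(H @ z + c - A.T @ lam)     # stationarity of the Lagrangian
print(primal_res, kkt_res)
```

On this strongly convex instance, both residuals shrink toward zero, illustrating the pointwise convergence to a primal-dual pair described above; the sketch is not a substitute for the paper's analysis, which covers merely convex coupled quadratics.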
Therefore, EPOCHS should be applied with caution when solving general optimization problems.
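To make the two BCD update orders concrete, here is a minimal numpy sketch, on synthetic data, of cyclic BCD versus EPOCHS (a fresh random permutation of the blocks in every epoch) applied to an unconstrained strongly convex quadratic. Scalar blocks and all names are illustrative assumptions; each step minimizes the objective exactly in one coordinate.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Strongly convex quadratic f(z) = 0.5 z^T Q z - r^T z, minimizer z* = Q^{-1} r.
M = rng.standard_normal((n, n))
Q = M.T @ M + np.eye(n)
r = rng.standard_normal(n)
z_star = np.linalg.solve(Q, r)

def bcd(num_epochs, permute):
    """Coordinate-wise BCD; each update minimizes f exactly in z[i]."""
    z = np.zeros(n)
    for _ in range(num_epochs):
        order = rng.permutation(n) if permute else range(n)
        for i in order:
            # Exact minimization in coordinate i:
            # Q[i, i] z_i = r_i - sum_{j != i} Q[i, j] z_j
            z[i] = (r[i] - Q[i] @ z + Q[i, i] * z[i]) / Q[i, i]
    return z

z_cyclic = bcd(500, permute=False)   # cyclic BCD: fixed order every epoch
z_epochs = bcd(500, permute=True)    # EPOCHS: new random permutation per epoch
print(np.linalg.norm(z_cyclic - z_star), np.linalg.norm(z_epochs - z_star))
```

Both variants converge on this instance (for a positive definite quadratic, every full sweep is a Gauss-Seidel-type pass that contracts the error regardless of the order); the paper's point is the finer one that the worst-case rate of EPOCHS can be strictly worse than that of cyclic BCD in the 2-block case.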