The augmented Lagrangian method (ALM) is fundamental for solving convex programming problems with linear constraints. The proximal version of ALM, which regularizes ALM's subproblem over the primal variable at each iteration by an additional positive-definite quadratic proximal term, has been well studied in the literature. In this paper, we show that it is not necessary to employ a positive-definite quadratic proximal term for the proximal ALM: convergence can still be ensured if the positive-definiteness is relaxed to positive-indefiniteness by reducing the proximal parameter. The positive-indefinite proximal ALM is thus proposed for the generic setting of convex programming problems with linear constraints. We show that our relaxation is optimal in the sense that the proximal parameter cannot be further reduced. The consideration of positive-indefinite proximal regularization is particularly meaningful for generating larger step sizes when solving the primal subproblems of ALM. When the model under discussion is separable, in the sense that its objective function consists of finitely many additive function components without coupled variables, it is desirable to decompose each of ALM's primal subproblems in a Jacobian manner, replacing the original primal subproblem by a sequence of easier and smaller decomposed subproblems so that parallel computation can be applied. This full Jacobian splitting version of ALM is known not to be necessarily convergent, and it has been shown in the literature that its convergence can be ensured if all the decomposed subproblems are further regularized by sufficiently large proximal terms. How small the proximal parameter can be, however, remains open. The second purpose of this paper is to determine the smallest proximal parameter for the full Jacobian splitting version of ALM for solving multi-block separable convex minimization models.
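To make the indefinite-proximal idea concrete, the following is a minimal sketch on a toy problem, not the paper's exact scheme or parameter bound: minimize 0.5*||x||^2 subject to a^T x = b, with proximal matrix D = r*I - beta*a*a^T where r = tau*beta*||a||^2. For tau < 1 this D is indefinite, yet the resulting linearized primal update still converges on this example. The values of tau and beta below are illustrative assumptions.

```python
# Indefinite-proximal ALM sketch (assumed toy instance, not the paper's method):
#   minimize 0.5*||x||^2  subject to  a^T x = b.
# With D = r*I - beta*a*a^T and r = tau*beta*||a||^2, the quadratic coupling
# cancels and the primal subproblem has a closed-form (linearized) solution.

def indefinite_proximal_alm(a, b, beta=1.0, tau=0.8, iters=2000):
    n = len(a)
    r = tau * beta * sum(ai * ai for ai in a)  # tau < 1 => D is indefinite
    x = [0.0] * n                              # primal iterate
    lam = 0.0                                  # dual multiplier
    for _ in range(iters):
        resid = sum(ai * xi for ai, xi in zip(a, x)) - b  # a^T x - b
        # primal update: argmin of 0.5*||x||^2 + linearized augmented term
        # + (r/2)*||x - x^k||^2, solved in closed form coordinate-wise
        x = [(r * xi - ai * (lam + beta * resid)) / (1.0 + r)
             for ai, xi in zip(a, x)]
        resid = sum(ai * xi for ai, xi in zip(a, x)) - b
        lam += beta * resid                    # dual ascent step
    return x, lam

# With a = (1, 1), b = 1, the iterates approach the projection of the
# origin onto the constraint, x* = (0.5, 0.5), with multiplier lam* = -0.5.
x, lam = indefinite_proximal_alm([1.0, 1.0], 1.0)
```

The point of the sketch is that choosing r below beta*||a||^2 (so the proximal term is no longer positive definite) enlarges the effective primal step size while, for suitable tau, the iteration remains convergent.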