Jordan algebras are an important tool for dealing with semidefinite programming and, more generally, optimization over symmetric cones. In this paper, we make judicious use of Jordan algebras in the context of sequential optimality conditions to generalize the global convergence theory of an Augmented Lagrangian method for nonlinear semidefinite programming. An approximate complementarity measure in this context is typically defined in terms of the eigenvalues of the constraint matrix and the eigenvalues of an approximate Lagrange multiplier. By exploiting the Jordan-algebraic structure of the problem, we show that a simpler complementarity measure, defined in terms of the Jordan product, is stronger than the one defined in terms of eigenvalues. Thus, besides avoiding a delicate eigenvalue analysis, we obtain a stronger necessary optimality condition. We then prove global convergence of an Augmented Lagrangian algorithm to points satisfying this improved necessary optimality condition. The optimality conditions we present are sequential ones and require no constraint qualification; in particular, the global convergence result holds even when the Lagrange multipliers are unbounded.
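For concreteness, the display below is a minimal sketch of the objects involved, in notation not taken from this abstract: $G(x) \succeq 0$ denotes the matrix constraint, $(x_k, Y_k)$ a sequence of primal iterates and approximate Lagrange multipliers, and the two complementarity measures are illustrative formulations under these assumptions, not the paper's exact conditions.
\[
  X \circ Y \;:=\; \tfrac{1}{2}\,(XY + YX)
  \qquad\text{(Jordan product on symmetric matrices),}
\]
\[
  \lambda_i\big(G(x_k)\big)\,\lambda_i\big(Y_k\big) \;\to\; 0,\quad i = 1,\dots,n
  \qquad\text{(eigenvalue-based measure, eigenvalues taken in a fixed order),}
\]
\[
  \big\|\,G(x_k) \circ Y_k\,\big\| \;\to\; 0
  \qquad\text{(Jordan-product-based measure).}
\]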