We study the statistical consistency of distributionally robust optimization (DRO) with metric-based ambiguity sets. While convergence of optimal values is well understood, a unified set-valued analysis of feasible regions and solution sets remains largely missing, especially for constrained DRO. We develop a general variational framework based on a collapse principle, which requires that all probability measures in the ambiguity sets converge weakly to the true distribution. Under this condition, we establish uniform convergence of the robust objective and constraint functionals, and combine it with Painlevé–Kuratowski (PK) set convergence to derive consistency of optimal values and solution sets. In particular, we prove upper convergence of minimizers and convergence of feasible regions, yielding a full set-valued characterization of statistical consistency. The framework applies to general metric-based ambiguity sets and extends naturally to constrained and multi-constraint DRO problems. All results hold almost surely.
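
The collapse principle and the resulting uniform convergence can be sketched in symbols as follows. The notation here is an illustrative assumption, not fixed by the abstract: $\mathcal{P}_n$ denotes the data-driven ambiguity set built from $n$ samples, $P^\star$ the true distribution, and $d$ any metric metrizing weak convergence (e.g. the Prokhorov or a bounded-Lipschitz metric).

```latex
% Collapse principle (sketch; notation assumed, see lead-in):
\[
  \sup_{Q \in \mathcal{P}_n} d\bigl(Q, P^\star\bigr) \longrightarrow 0
  \quad \text{almost surely as } n \to \infty,
\]
% i.e. every measure in the ambiguity set converges weakly to the true
% distribution. Under suitable continuity and integrability assumptions,
% the worst-case functional then converges uniformly (on compacta) to its
% population counterpart:
\[
  \sup_{x \in X} \Bigl| \sup_{Q \in \mathcal{P}_n} \mathbb{E}_Q[f(x,\xi)]
  - \mathbb{E}_{P^\star}[f(x,\xi)] \Bigr| \longrightarrow 0
  \quad \text{almost surely}.
\]
```

The same uniform-convergence statement, applied to each constraint functional, is what drives the PK convergence of feasible regions mentioned above.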