This paper investigates non-convex stochastic compositional optimization under heavy-tailed noise, where the stochastic noise has a bounded $p$th moment with $p\in(1,2]$. The main challenges arise from biased gradient estimates of the objective and the violation of the standard bounded-variance assumption. To address these issues, we propose a generic algorithmic framework of Normalized Stochastic Compositional Gradient methods (NSCG) and study two variance-reduced methods within this framework: NSCG-M and NSCG-S. Considering scenarios both with and without prior knowledge of $p$, we analyze the sample complexity of finding an $\epsilon$-stationary point: for NSCG-M under standard Lipschitz continuity and smoothness conditions, and for NSCG-S under additional mean-$p$th moment Lipschitzness and mean-$p$th moment smoothness conditions, which are less stringent than existing ones. The sample complexity orders derived in this paper are competitive with state-of-the-art results for first-order methods in single-level non-convex stochastic optimization under heavy-tailed noise. Finally, we report results of numerical experiments showcasing the effectiveness of the proposed methods.
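For concreteness, a minimal sketch of the problem setting (an assumed formulation; the paper's exact notation may differ) is the two-level compositional problem
\[
  \min_{x \in \mathbb{R}^d} \; F(x) := f\bigl(g(x)\bigr), \qquad g(x) = \mathbb{E}_{\zeta}\bigl[g_{\zeta}(x)\bigr], \quad f(y) = \mathbb{E}_{\xi}\bigl[f_{\xi}(y)\bigr],
\]
where heavy-tailed noise means the stochastic oracles satisfy only a bounded $p$th central moment condition, e.g. $\mathbb{E}\bigl[\|\nabla g_{\zeta}(x) - \nabla g(x)\|^{p}\bigr] \le \sigma^{p}$ for some $p \in (1,2]$, rather than the standard bounded-variance assumption ($p=2$).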