In this paper, we address composite optimization problems on Hadamard manifolds, where the objective function is given by the sum of a smooth term (not necessarily convex) and a convex term (not necessarily differentiable). To solve such problems, we develop a proximal gradient method defined directly on the manifold, employing a strategy that enforces a monotone decrease of the objective function values along the generated sequence. We investigate its convergence properties without assuming that the gradient of the smooth component is Lipschitz continuous.
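For concreteness, one standard way to formalize such a scheme on a Hadamard manifold $M$ is sketched below; the iteration and the backtracking rule are common Riemannian proximal gradient ingredients and are illustrative assumptions, not necessarily the paper's exact method:
% Illustrative sketch; the step-size rule below is an assumed backtracking
% choice, not necessarily the paper's exact strategy.
\[
\min_{x \in M} \; F(x) := f(x) + g(x),
\qquad
x_{k+1} \in \operatorname*{arg\,min}_{y \in M} \Big\{ g(y) + \big\langle \operatorname{grad} f(x_k),\, \exp_{x_k}^{-1} y \big\rangle_{x_k} + \tfrac{1}{2\lambda_k}\, d^2(x_k, y) \Big\},
\]
where $\exp_{x_k}^{-1}$ denotes the inverse exponential map at $x_k$ and $d$ the Riemannian distance. Without a Lipschitz constant for $\operatorname{grad} f$, the step size $\lambda_k > 0$ can be found by backtracking until
\[
F(x_{k+1}) \le F(x_k) - \tfrac{\sigma}{2\lambda_k}\, d^2(x_k, x_{k+1}), \qquad \sigma \in (0,1),
\]
which yields the monotone decrease of $F(x_k)$ along the generated sequence.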