In this paper we propose a randomized block coordinate non-monotone gradient (RBCNMG) method for minimizing the sum of a smooth (possibly nonconvex) function and a block-separable (possibly nonconvex nonsmooth) function. At each iteration, this method randomly picks a block according to a prescribed probability distribution and solves a sequence of associated proximal subproblems, which typically admit closed-form solutions, until a sufficient reduction of the objective function is achieved. We show that the sequence of expected objective values generated by this method converges to the expected value of the limit of the objective value sequence produced by a random single run. Moreover, the solution sequence generated by this method comes arbitrarily close to an approximate stationary point with high probability. When the problem under consideration is convex, we further establish that the sequence of expected objective values converges to the optimal value of the problem, and that the objective value sequence comes arbitrarily close to the optimal value with high probability. We also conduct preliminary experiments to test the performance of the RBCNMG method on the $\ell_1$-regularized least-squares problem. The computational results demonstrate that our method substantially outperforms the randomized block coordinate descent method proposed by Richtárik and Takáč (Math Programming 2012).
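To make the scheme concrete, the following is a minimal sketch of a randomized block coordinate proximal gradient iteration applied to the $\ell_1$-regularized least-squares problem from the experiments. It is an illustrative simplification, not the authors' exact RBCNMG method: blocks are chosen uniformly at random, the proximal subproblem is solved in closed form by soft-thresholding, and the paper's non-monotone acceptance test is replaced by a plain backtracking sufficient-decrease condition. All function and parameter names here are our own.

```python
import numpy as np

def soft_threshold(v, t):
    # Closed-form solution of the proximal subproblem for t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def rbc_prox_gradient(A, b, lam, n_blocks=4, iters=500, seed=0):
    """Sketch of a randomized block coordinate proximal gradient method for
        min_x  0.5 * ||A x - b||^2 + lam * ||x||_1.
    Each iteration picks one block uniformly at random and backtracks on the
    block prox step until the objective is sufficiently reduced (a monotone
    stand-in for the paper's non-monotone reduction test)."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    blocks = np.array_split(np.arange(n), n_blocks)
    f = lambda z: 0.5 * np.sum((A @ z - b) ** 2) + lam * np.sum(np.abs(z))
    x = np.zeros(n)
    fx = f(x)
    for _ in range(iters):
        blk = blocks[rng.integers(n_blocks)]
        g = A.T @ (A @ x - b)          # full gradient; only the blk entries are used
        step = 1.0
        for _ in range(30):            # backtracking until sufficient reduction
            x_new = x.copy()
            x_new[blk] = soft_threshold(x[blk] - step * g[blk], step * lam)
            f_new = f(x_new)
            if f_new <= fx - 1e-4 * np.sum((x_new[blk] - x[blk]) ** 2) / step:
                break
            step *= 0.5
        x, fx = x_new, f_new
    return x, fx
```

Because only one block of coordinates changes per iteration, the per-iteration cost can be kept low in practice by maintaining the residual $Ax - b$ incrementally rather than recomputing the full gradient, as done here for clarity.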