Mixed-Integer Rounding (MIR) cuts are effective at improving the dual bound in Mixed-Integer Linear Programming (MIP). In practice, however, MIR cuts are separated heuristically rather than by optimization, as the latter is prohibitively expensive. We present a hybrid cut generation framework in which we train a Machine Learning (ML) model to inform cut generation for a family of similar instances. Our framework solves a MIP-based separation problem to generate high-quality MIR cuts, then learns to identify the constraints that gave rise to these effective cuts. At test time, the predictions of the ML model allow us to solve a reduced, and therefore cheaper, MIP-based separation problem. We present computational results for this approach on datasets of randomly perturbed MIPLIB2017 instances.
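As brief background, the following is the standard MIR inequality in its textbook form (the notation here is generic and not taken from this work): given a base row $\sum_{j} a_j x_j + s \ge b$ with $x_j \in \mathbb{Z}_{\ge 0}$ and $s \ge 0$, the MIR cut strengthens it by rounding.

```latex
% Standard MIR inequality (textbook form; notation is illustrative, not from this paper).
% Base row: \sum_{j} a_j x_j + s \ge b, with x_j \in \mathbb{Z}_{\ge 0}, s \ge 0.
% Let f = b - \lfloor b \rfloor and f_j = a_j - \lfloor a_j \rfloor, assuming f > 0.
\[
  \sum_{j} \left( \lfloor a_j \rfloor + \frac{\max(f_j - f,\, 0)}{1 - f} \right) x_j
  \;+\; \frac{s}{1 - f} \;\ge\; \lceil b \rceil .
\]
```

The MIP-based separation problem referenced above searches over aggregation weights for the original constraints to produce a base row whose MIR cut is maximally violated; the ML model's role is to predict which constraints are worth including in that search.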