This work deals with methods for solving convex-constrained monotone nonlinear equations. We first propose a framework obtained by combining a safeguard strategy on the search directions with a notion of approximate projections. Global convergence of the framework is established under appropriate assumptions, and several methods that fall into it are presented. In particular, inexact versions of steepest-descent-based, spectral-gradient-like, Newton-like, and limited-memory BFGS methods are discussed. Numerical experiments illustrate the practical behavior of the algorithms, and comparisons with existing methods are also presented.
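To fix ideas, a classical member of this family of methods is the hyperplane-projection scheme of Solodov and Svaiter, which the steepest-descent-based variant resembles. The sketch below is an illustrative assumption, not the paper's framework: it uses the exact projection onto a box as a simple stand-in for the approximate-projection oracle, takes d_k = -F(x_k) as the search direction, and all parameter names (beta, rho, sigma) are hypothetical defaults.

```python
import numpy as np

def project_box(x, lo, hi):
    # Exact projection onto the box C = [lo, hi]; a simplified stand-in
    # for an approximate-projection oracle onto a general convex set.
    return np.clip(x, lo, hi)

def hyperplane_projection_solve(F, x0, lo, hi, tol=1e-8, max_iter=500,
                                beta=1.0, rho=0.5, sigma=1e-4):
    # Derivative-free projection method for a monotone map F on a box:
    # steepest-descent-like direction, backtracking line search, then a
    # projection onto the separating hyperplane followed by projection onto C.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            return x
        d = -Fx
        # Line search: find t with -F(x + t*d)^T d >= sigma * t * ||d||^2.
        t = beta
        while True:
            z = x + t * d
            if -F(z) @ d >= sigma * t * (d @ d) or t < 1e-12:
                break
            t *= rho
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            return z
        # Project x onto the hyperplane {y : F(z)^T (y - z) = 0}, then onto C.
        x = project_box(x - (Fz @ (x - z)) / (Fz @ Fz) * Fz, lo, hi)
    return x
```

For example, F(x) = exp(x) - 1 (componentwise) is monotone with root at the origin, and the iteration converges to it from an interior starting point of the box [-2, 3]. Quasi-Newton variants of the kind surveyed in the paper would replace d_k = -F(x_k) with a spectral or L-BFGS direction under a safeguard test.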