A very simple first-order algorithm is proposed for solving nonlinear optimization problems with deterministic nonlinear equality constraints. The algorithm adaptively selects either steps in the plane tangent to the constraints or steps that reduce infeasibility, without using a merit function or a filter. The tangent steps are based on the AdaGrad method for unconstrained minimization. The objective function is never evaluated by the algorithm, making it suitable for noisy problems. Its worst-case evaluation complexity is analyzed, yielding a global convergence rate of $\mathcal{O}(1/\sqrt{k})$, which matches the best known rate of first-order methods for unconstrained problems. Numerical experiments are presented, suggesting that the performance of the algorithm is comparable to that of first-order methods for unconstrained problems, and that its reliability remains remarkably stable in the presence of noise in the gradient.
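To illustrate the kind of iteration the abstract describes, the following is a minimal sketch, not the paper's actual method: it alternates a first-order infeasibility-reducing step with an AdaGrad-style step projected onto the tangent space of the constraints, and it uses only constraint values, Jacobians, and objective gradients (never the objective itself). The switching rule, step sizes, and all function names (`constrained_adagrad_sketch`, `feas_tol`, etc.) are assumptions made for illustration.

```python
import numpy as np

def constrained_adagrad_sketch(x0, grad_f, c, J, max_iter=1000,
                               eta=1.0, eps=1e-8, feas_tol=1e-3):
    """Illustrative sketch only (not the algorithm of the paper):
    alternate infeasibility-reducing steps with AdaGrad-style steps
    in the tangent space of the equality constraints c(x) = 0."""
    x = np.asarray(x0, dtype=float)
    accum = np.zeros_like(x)  # AdaGrad accumulator for the tangent steps
    for k in range(max_iter):
        ck, Jk = c(x), J(x)
        if np.linalg.norm(ck) > feas_tol:
            # Normal step: steepest-descent direction for 0.5 * ||c(x)||^2,
            # computed from constraint values and Jacobians only.
            d = -Jk.T @ ck
            x = x + (eta / np.sqrt(k + 1)) * d
        else:
            # Tangent step: project the objective gradient onto the null
            # space of J(x), then scale componentwise as in AdaGrad.
            # Only the gradient of f is used, never f itself.
            g = grad_f(x)
            P = np.eye(x.size) - Jk.T @ np.linalg.pinv(Jk.T)  # null-space projector
            gt = P @ g
            accum += gt ** 2
            x = x - eta * gt / np.sqrt(accum + eps)
    return x
```

The componentwise scaling by the square root of the accumulated squared projected gradients is what gives the tangent steps their AdaGrad flavor and, in the unconstrained setting, underlies the $\mathcal{O}(1/\sqrt{k})$ rate quoted above; the particular normal-step rule and the feasibility threshold used here are placeholders.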