In this paper, we revisit the convergence theory of the inexact restoration paradigm for non-linear optimization. We first identify the basic elements of a globally convergent method based on merit functions. The inexact restoration method, which employs a two-phase iteration, is then introduced as a special case. A specific implementation, based on the solution of regularized subproblems, is presented. The proposed inexact restoration method allows more freedom in the computation of iterates and includes a novel procedure that integrates the update of the penalty parameter with the optimization phase. In addition, we show how the method can be accelerated by solving a quadratic programming subproblem. Alternatively, the presented method can be interpreted as a globally convergent sequential quadratic programming method that splits the iteration into two phases (feasibility and optimality) whenever the quadratic subproblem is infeasible. Theoretical results include an asymptotic convergence theory as well as a worst-case iteration and evaluation complexity analysis of the introduced methods. The paper concludes with numerical experiments that illustrate the practical application of the proposed methods.
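To make the two-phase structure concrete, the following is a minimal Python sketch of a generic inexact restoration iteration for an equality-constrained problem min f(x) s.t. c(x) = 0. The toy problem, the merit function Phi(x, theta) = theta f(x) + (1 - theta)||c(x)||, the step sizes, and the penalty update rule are illustrative assumptions; they do not reproduce the paper's specific algorithm, its regularized subproblems, or its quadratic programming acceleration.

```python
# Minimal sketch (illustrative, not the paper's algorithm) of a generic
# two-phase inexact restoration iteration for  min f(x)  s.t.  c(x) = 0,
# accepting trial points with the merit function
#   Phi(x, theta) = theta*f(x) + (1 - theta)*||c(x)||.
import numpy as np

def f(x):                      # toy objective
    return 0.5 * np.dot(x, x)

def grad_f(x):
    return x

def c(x):                      # toy equality constraint: x0 + x1 - 1 = 0
    return np.array([x[0] + x[1] - 1.0])

def jac_c(x):
    return np.array([[1.0, 1.0]])

def merit(x, theta):
    return theta * f(x) + (1.0 - theta) * np.linalg.norm(c(x))

def restoration_phase(x, steps=5):
    """Feasibility phase: reduce ||c(x)|| with Gauss-Newton steps."""
    y = x.copy()
    for _ in range(steps):
        step = np.linalg.lstsq(jac_c(y), c(y), rcond=None)[0]
        y = y - step
    return y

def optimization_phase(y, alpha=0.5):
    """Optimality phase: gradient step projected onto the null space of c'(y)."""
    J = jac_c(y)
    P = np.eye(len(y)) - J.T @ np.linalg.solve(J @ J.T, J)
    return y - alpha * (P @ grad_f(y))

def inexact_restoration(x0, theta=0.9, max_iter=100, tol=1e-8):
    x = x0.copy()
    for _ in range(max_iter):
        y = restoration_phase(x)      # phase 1: improve feasibility
        z = optimization_phase(y)     # phase 2: improve optimality
        # Shrink the penalty parameter until the trial point yields a
        # sufficient decrease of the merit function (a simple ad hoc rule).
        while (merit(z, theta) > merit(x, theta)
               - 0.1 * (1.0 - theta) * np.linalg.norm(c(x))) and theta > 1e-8:
            theta *= 0.5
        x = z
        # Stop when the iterate is (nearly) feasible and the projected
        # gradient of the objective is (nearly) zero.
        J = jac_c(x)
        P = np.eye(len(x)) - J.T @ np.linalg.solve(J @ J.T, J)
        if np.linalg.norm(c(x)) < tol and np.linalg.norm(P @ grad_f(x)) < tol:
            break
    return x, theta

x_star, theta_final = inexact_restoration(np.array([2.0, 0.0]))
print(x_star)  # approaches [0.5, 0.5], the minimizer of f on the constraint
```

In this sketch the restoration phase only improves feasibility, the optimization phase only improves a measure of optimality near the restored point, and the penalty parameter mediates between the two when deciding whether to accept the trial point.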