We aim to factorize a completely positive matrix by using an optimization approach which consists of minimizing a nonconvex smooth function over a convex and compact set. To solve this problem we propose a projected gradient algorithm with parameters that take into account the effects of relaxation and inertia. Both the projection and the gradient steps are simple in the sense that they have explicit formulas and do not require inner loops. Furthermore, no expensive procedure is needed to find an appropriate starting point. The convergence analysis, which relies on the Łojasiewicz inequality, shows that the whole sequence of generated iterates converges to a critical point of the objective function. A convergence rate, expressed in terms of the Łojasiewicz exponent of a regularization of the objective function, is also provided. Numerical experiments demonstrate the efficiency of the proposed method, in particular in comparison to other factorization algorithms, and emphasize the role of the relaxation and inertial parameters.
Citation: to appear in "Numerical Linear Algebra with Applications"
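To make the scheme described in the abstract concrete, below is a minimal sketch of a generic relaxed-inertial projected gradient iteration for a completely positive factorization problem. It is an illustration under stated assumptions, not the paper's exact algorithm: the objective is taken to be f(X) = 0.5‖XXᵀ − A‖²_F, the feasible set is assumed to be the intersection of the nonnegative orthant with a Frobenius-norm ball (for which the projection has an explicit formula), and the parameter names `mu` (step size), `alpha` (inertia), and `rho` (relaxation) are hypothetical choices for exposition.

```python
import numpy as np

def cp_residual(X, A):
    # Objective f(X) = 0.5 * ||X X^T - A||_F^2 for symmetric A.
    return 0.5 * np.linalg.norm(X @ X.T - A, "fro") ** 2

def grad(X, A):
    # Gradient of f for symmetric A: 2 (X X^T - A) X.
    return 2.0 * (X @ X.T - A) @ X

def project(X, radius):
    # Explicit projection onto {X >= 0} intersected with the Frobenius
    # ball of the given radius: clip to the nonnegative orthant, then
    # rescale if the norm exceeds the radius (valid because the ball is
    # centered at the origin and the orthant is a cone).
    Y = np.maximum(X, 0.0)
    n = np.linalg.norm(Y, "fro")
    return Y if n <= radius else Y * (radius / n)

def inertial_projected_gradient(A, r, steps=500, mu=1e-2,
                                alpha=0.3, rho=0.8, radius=10.0, seed=0):
    # Generic relaxed-inertial projected gradient sketch (parameter
    # values are illustrative, not tuned).
    rng = np.random.default_rng(seed)
    x_prev = x = rng.uniform(0.0, 1.0, size=(A.shape[0], r))
    z = x
    for _ in range(steps):
        y = x + alpha * (x - x_prev)                 # inertial extrapolation
        z = project(y - mu * grad(y, A), radius)     # projected gradient step
        x_prev, x = x, (1.0 - rho) * y + rho * z     # relaxed update
    return z  # last projected (hence feasible) iterate
```

Note that both steps are explicit, matching the abstract's point that no inner loops are required: the gradient is a closed-form matrix expression and the projection is a clip followed by a rescaling.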