Proximal Approaches for Matrix Optimization Problems: Application to Robust Precision Matrix Estimation.

In recent years, there has been a growing interest in mathematical models leading to the minimization, in a symmetric matrix space, of a Bregman divergence coupled with a regularization term. We address problems of this type within a general framework where the regularization term is split into two parts, one being a spectral function while the other is arbitrary. A Douglas–Rachford approach is proposed to address such problems, and a list of proximity operators is provided, allowing us to consider various choices for the fit-to-data functional and for the regularization term. Based on our theoretical results, a novel approach is proposed for the noisy graphical lasso problem, where a precision matrix has to be statistically estimated in the presence of noise. The nonconvexity of the resulting objective function is handled with a majorization–minimization approach, by building a sequence of convex surrogates and solving the inner optimization subproblems via the aforementioned Douglas–Rachford procedure. We establish conditions for the convergence of this iterative scheme and we illustrate its good numerical performance with respect to state-of-the-art approaches.
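As a rough illustration of the problem class described above, the following sketch writes the minimization as a Bregman divergence plus a two-part regularizer over a symmetric matrix space, together with a generic Douglas–Rachford iteration for one possible splitting. The notation, the particular splitting, and the parameter choices below are our own illustrative assumptions and are not taken from the abstract.

```latex
% Generic instance of the problem class (notation and splitting are
% illustrative assumptions, not the authors' exact formulation).
\begin{equation*}
  \min_{X \in \mathcal{S}^n} \; D_\varphi(X, S) \;+\; g\bigl(\lambda(X)\bigr) \;+\; h(X),
\end{equation*}
% where $D_\varphi$ is a Bregman divergence generated by a convex potential
% $\varphi$, $S$ is the observed matrix, $g \circ \lambda$ is a spectral
% regularizer acting on the eigenvalues $\lambda(X)$, and $h$ is an arbitrary
% regularization term.
%
% A standard Douglas--Rachford iteration for the (assumed) splitting
% $f_1 = D_\varphi(\cdot, S) + g \circ \lambda$ and $f_2 = h$ reads:
\begin{align*}
  X_{k}   &= \operatorname{prox}_{\gamma f_1}(Y_k), \\
  Z_{k}   &= \operatorname{prox}_{\gamma f_2}(2 X_k - Y_k), \\
  Y_{k+1} &= Y_k + \mu_k \,(Z_k - X_k),
\end{align*}
% with step size $\gamma > 0$ and relaxation parameters $\mu_k \in (0, 2)$.
```

In this form, the proximity operators listed in the paper would supply closed-form or efficiently computable steps for the spectral and fit-to-data terms; the choice of which terms to group into $f_1$ and $f_2$ above is an assumption made for illustration.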
