In this article we present a version of the proximal alternating direction method for convex problems with linear constraints and a separable objective function, in which the standard quadratic regularizing term is replaced by an interior proximal metric for those variables that must satisfy additional convex constraints. The proposed method also has the advantage that the iterates need only be computed approximately. Under standard assumptions, we establish global convergence of the primal-dual sequences generated by the algorithm. Finally, we report numerical experiments on statistical learning problems that illustrate the behavior of the algorithm.
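For readers unfamiliar with the baseline being modified, the following is a minimal sketch of the classical alternating direction method of multipliers with the standard quadratic proximal term (the method whose regularizer the paper replaces with an interior proximal metric). The problem instance, step size `rho`, and iteration count are illustrative choices, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_l1_denoise(a, lam=0.5, rho=1.0, iters=200):
    """Classical ADMM for the separable problem
         min_{x,z}  0.5*||x - a||^2 + lam*||z||_1
         s.t.       x - z = 0,
    using the standard quadratic augmented-Lagrangian term."""
    x = np.zeros_like(a)
    z = np.zeros_like(a)
    u = np.zeros_like(a)  # scaled dual variable
    for _ in range(iters):
        # x-update: minimize 0.5*||x - a||^2 + (rho/2)*||x - z + u||^2
        x = (a + rho * (z - u)) / (1.0 + rho)
        # z-update: minimize lam*||z||_1 + (rho/2)*||x - z + u||^2
        z = soft_threshold(x + u, lam / rho)
        # dual ascent on the residual of the constraint x - z = 0
        u = u + x - z
    return x, z

a = np.array([3.0, 0.2, -2.0, 0.05])
x, z = admm_l1_denoise(a)
# The limit coincides with soft-thresholding of a at level lam.
```

The paper's contribution modifies the quadratic terms above (for the constrained variables) into Bregman-type interior proximal terms and allows the subproblems to be solved inexactly; this sketch shows only the exact quadratic baseline.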
Citation
Unpublished. Submitted to Journal. December 2015.