The Alternating Minimization Algorithm (AMA) was proposed by Tseng for solving convex programming problems with two-block separable linear constraints and objectives, where (at least) one component of the objective is assumed to be strongly convex. The implementability of AMA is affected by the fact that one of the subproblems solved in each iteration does not usually reduce to the evaluation of a proximal operator through a closed formula. In this paper we allow an additional smooth convex function in each block of the objective and propose a proximal version of AMA, called Proximal AMA, obtained by equipping the algorithm with proximal terms induced by variable metrics. For suitable choices of these metrics, solving the two subproblems of the iterative scheme reduces to the computation of proximal operators. We investigate the convergence of the proposed algorithm in a real Hilbert space setting and illustrate its numerical performance on two applications in image processing and machine learning.
The Proximal Alternating Minimization Algorithm for two-block separable convex optimization problems with linear constraints
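The abstract's key practical point is that the subproblems should reduce to proximal operators that admit closed formulas. As a minimal illustration of what such a closed-form proximal operator looks like (not the paper's algorithm itself), the following sketch evaluates the proximal operator of the weighted l1-norm, which is the classical soft-thresholding map; the function name and the test vector are illustrative choices, not taken from the paper.

```python
import numpy as np

def prox_l1(v, lam):
    # Proximal operator of f(x) = lam * ||x||_1 evaluated at v:
    #   prox_f(v) = argmin_x lam * ||x||_1 + 0.5 * ||x - v||^2,
    # which has the closed-form componentwise soft-thresholding solution.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

# Components with magnitude <= lam are set to zero; the rest shrink toward 0.
v = np.array([3.0, -0.5, 1.2])
print(prox_l1(v, 1.0))
```

Proximal operators of this kind (for l1-norms, indicator functions of simple sets, quadratics, and similar) are exactly the building blocks that make each subproblem of a proximal splitting scheme cheap to solve.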