We provide new insight into a generalized conditional subgradient algorithm and a generalized mirror descent algorithm for the convex minimization problem \[\min_x \; \{f(Ax) + h(x)\}.\] As Bach showed in [SIAM J. Optim., 25 (2015), pp. 115–129], applying either of these two algorithms to this problem is equivalent to applying the other one to its Fenchel dual. We leverage this duality relationship to develop new upper bounds and convergence results for the gap between the primal and dual iterates generated by these two algorithms. We also propose a new primal-dual hybrid algorithm that combines features of the conditional subgradient and mirror descent algorithms to solve the primal and dual problems in a symmetric fashion. Our algorithms and main results rely only on the availability of computable oracles for the subdifferentials $\partial f$ and $\partial h^*$, and for the linear maps $A$ and $A^*$.
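For concreteness, the Fenchel dual referred to above can be written, under a standard sign convention that may differ from the one adopted in the paper, as \[\max_y \; \{-f^*(y) - h^*(-A^*y)\},\] where $f^*$ and $h^*$ denote the convex conjugates of $f$ and $h$. Weak duality gives $f(Ax) + h(x) \ge -f^*(y) - h^*(-A^*y)$ for all $x$ and $y$, so the difference between these two quantities is a computable bound on the primal-dual gap of any pair of iterates.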
Citation
Working Paper, Tepper School of Business, Carnegie Mellon University, March 2019