We introduce an analysis framework for constructing optimal first-order primal-dual methods for the prototypical constrained convex optimization template. While these methods offer scalability advantages in obtaining numerical solutions, they have the disadvantage of producing sequences that are only approximately feasible for the problem constraints. As a result, it is theoretically challenging to compare the efficiency of different methods. To this end, we rigorously prove that, in the worst case, the convergence of the primal objective residual in first-order primal-dual algorithms must compete with their constraint feasibility convergence, and we mathematically summarize this fundamental trade-off. We then provide a heuristic-free analysis recipe for constructing optimal first-order primal-dual algorithms that achieve a desirable trade-off between the primal objective residual and the feasibility gap, and whose iteration convergence rates cannot be improved. Our technique obtains a smoothed estimate of the primal-dual gap and drives the smoothness parameters to zero while simultaneously minimizing the smoothed gap using the problem's first-order oracles.
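To make the last idea concrete, here is a minimal toy sketch of the generic smoothing mechanism the abstract alludes to, not the paper's actual algorithm. We take the simplest instance of the constrained template, minimize 0.5‖x‖² subject to Ax = b, smooth the dual with a quadratic term weighted by 1/β_k, drive β_k → 0 across iterations, and update the dual variable by ascent on the smoothed problem (an augmented-Lagrangian-style scheme). All names and step sizes below are illustrative assumptions.

```python
import numpy as np

# Toy instance of the constrained convex template:
#     minimize 0.5 * ||x||^2   subject to   A x = b.
# Illustrative sketch only -- not the authors' method. We minimize the
# beta-smoothed Lagrangian exactly in x, then take a dual ascent step,
# while the smoothness parameter beta_k = 1/k is driven to zero.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 6))
b = rng.standard_normal(3)

x = np.zeros(6)
lam = np.zeros(3)          # dual variable (Lagrange multiplier estimate)
for k in range(1, 2001):
    beta = 1.0 / k         # smoothness parameter, driven to zero
    # Primal step: exact minimizer of the smoothed Lagrangian
    #   0.5||x||^2 + lam^T (A x - b) + (1/(2*beta)) ||A x - b||^2,
    # i.e. solve (I + A^T A / beta) x = A^T (b/beta - lam).
    H = np.eye(6) + A.T @ A / beta
    x = np.linalg.solve(H, A.T @ (b / beta - lam))
    # Dual ascent step on the smoothed dual
    lam = lam + (A @ x - b) / beta

# Compare against the closed-form solution x* = A^T (A A^T)^{-1} b.
x_star = A.T @ np.linalg.solve(A @ A.T, b)
feas_gap = np.linalg.norm(A @ x - b)
print(f"feasibility gap {feas_gap:.2e}, "
      f"distance to optimum {np.linalg.norm(x - x_star):.2e}")
```

The trade-off the abstract describes is visible here: shrinking β_k faster tightens constraint feasibility but makes each smoothed subproblem harder, so the objective residual and the feasibility gap cannot both be driven down arbitrarily fast.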
Tech. Report, October 2015
An optimal first-order primal-dual gap reduction framework for constrained convex optimization