We consider solving huge-scale instances of (convex) conic linear optimization problems, at the scale where matrix-factorization-free methods are attractive or necessary. The restarted primal-dual hybrid gradient method (rPDHG) -- with heuristic enhancements and GPU implementation -- has been very successful in solving huge-scale linear programming (LP) problems; however, its application to more general conic convex optimization problems is not as well studied. We analyze the theoretical and practical performance of rPDHG for general (convex) conic linear optimization, with LP as a special case thereof. We show a relationship between the geometry of the primal-dual (sub-)level sets W_eps and the convergence rate of rPDHG. Specifically, we prove a bound on the convergence rate of rPDHG that improves when there is a primal-dual (sub-)level set W_eps for which (i) W_eps is close to the optimal solution set (in Hausdorff distance), and (ii) the ratio of the diameter to the "conic radius" of W_eps is small. In the special case of LP, the performance of rPDHG is bounded only by this ratio applied to the (sub-)level set corresponding to the best non-optimal extreme point. Depending on the problem instance, this ratio can take on extreme values and can result in poor performance of rPDHG both in theory and in practice. To address this issue, we show how central-path-based linear transformations -- including conic rescaling -- can markedly enhance the convergence rate of rPDHG. Furthermore, we present computational results that demonstrate how such rescalings can accelerate convergence to high-accuracy solutions and lead to more efficient methods for huge-scale linear optimization problems.
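To make the method concrete, the following is a minimal sketch of restarted PDHG for an equality-form LP, min c^T x s.t. Ax = b, x >= 0. The tiny instance, the step-size rule, and the fixed-frequency restart-to-the-average scheme are illustrative assumptions, not the paper's algorithm or parameter choices.

```python
import numpy as np

def rpdhg(A, b, c, iters=2000, restart_every=100):
    """Illustrative restarted PDHG for min c^T x s.t. Ax = b, x >= 0."""
    m, n = A.shape
    op_norm = np.linalg.norm(A, 2)        # spectral norm of A
    tau = sigma = 0.9 / op_norm           # ensures tau * sigma * ||A||^2 < 1
    x, y = np.zeros(n), np.zeros(m)
    x_avg, y_avg, count = np.zeros(n), np.zeros(m), 0
    for k in range(iters):
        # primal step: gradient step on c - A^T y, projected onto x >= 0
        x_new = np.maximum(x - tau * (c - A.T @ y), 0.0)
        # dual step with extrapolated primal iterate
        y = y + sigma * (b - A @ (2 * x_new - x))
        x = x_new
        x_avg += x; y_avg += y; count += 1
        if (k + 1) % restart_every == 0:
            # restart: reset the iterate to the running average
            x, y = x_avg / count, y_avg / count
            x_avg, y_avg, count = np.zeros(n), np.zeros(m), 0
    return x, y

# Tiny LP: min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0; optimum is x = (1, 0).
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])
x, y = rpdhg(A, b, c)
print(np.round(x, 4))   # approximately [1, 0]
```

The restart-to-the-average step is what distinguishes rPDHG from plain PDHG; on LP it turns the sublinear ergodic rate into linear convergence, with the rate governed by the level-set geometry discussed above.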
The Role of Level-Set Geometry on the Performance of PDHG for Conic Linear Optimization