A Class of Smooth Exact Penalty Function Methods for Optimization Problems with Orthogonality Constraints

Updating the augmented Lagrangian multiplier by a closed-form expression yields an efficient first-order infeasible approach for optimization problems with orthogonality constraints, which makes parallelization tractable for this class of problems. Inspired by this closed-form updating scheme, we propose an exact penalty function model with compact convex constraints (PenC). We show that PenC is an exact penalty model that shares the same global minimizers as the original orthogonality-constrained problem. Based on PenC, we first propose a first-order algorithm called PenCF and establish its global convergence and local linear convergence rate under mild assumptions. For the case in which computing and storing the Hessian is affordable and a high-precision solution with a fast local convergence rate is desired, we propose a second-order approach called PenCS for solving PenC. To avoid the expensive computation, or the hard subproblem, involved in forming the Newton step, we propose a new strategy that computes it approximately while still retaining local quadratic convergence. Moreover, the main iterations of both PenCF and PenCS are orthonormalization-free and hence parallelizable. Numerical experiments illustrate that PenCF is comparable with existing first-order methods, while PenCS exhibits stability and high efficiency in obtaining high-precision solutions compared with existing second-order methods.
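To fix ideas, a minimal sketch of the problem class and of a penalty of the kind described above is given below; the specific multiplier expression $\Lambda(X)$, the penalty parameter $\beta$, and the compact convex set $\mathcal{M}$ used in PenC are defined in the paper, so the second display is an illustrative assumption rather than the exact PenC formulation:

$$\min_{X \in \mathbb{R}^{n \times p}} \; f(X) \quad \text{s.t.} \quad X^{\top} X = I_p,$$

$$\min_{X \in \mathcal{M}} \; f(X) \;-\; \tfrac{1}{2}\big\langle \Lambda(X),\, X^{\top} X - I_p \big\rangle \;+\; \tfrac{\beta}{4}\,\big\| X^{\top} X - I_p \big\|_F^{2},$$

where $\Lambda(X)$ is updated by a closed-form expression (for instance, a symmetrization of $X^{\top} \nabla f(X)$) instead of by a dual ascent step, so the iterations never require an orthonormalization of $X$.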

