Tilt Stability on Riemannian Manifolds with Application to Convergence Analysis of Generalized Riemannian Newton Method

We generalize tilt stability, a fundamental concept in perturbation analysis of optimization problems in Euclidean spaces, to the setting of Riemannian manifolds. We prove the equivalence of the following conditions: Riemannian tilt stability, Riemannian variational strong convexity, Riemannian uniform quadratic growth, local strong monotonicity of the Riemannian subdifferential, strong metric regularity of the Riemannian subdifferential, and positive … Read more
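
For orientation, the classical Euclidean notion being generalized here (in the sense of Poliquin and Rockafellar) can be stated as follows; the Riemannian version used in the paper is not reproduced in this excerpt. A point $\bar{x}$ gives a tilt-stable local minimum of $f:\mathbb{R}^n \to \overline{\mathbb{R}}$ if $f(\bar{x})$ is finite and there exists $\delta > 0$ such that the tilted solution mapping
$$ M_\delta(v) := \operatorname*{arg\,min}_{\|x-\bar{x}\| \le \delta} \bigl\{ f(x) - \langle v, x \rangle \bigr\} $$
is single-valued and Lipschitz continuous on some neighborhood of $v = 0$ with $M_\delta(0) = \bar{x}$. On a manifold, the linear tilt term has to be replaced by an appropriate intrinsic construction, which is part of what the paper develops.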

Stability analysis of parameterized models relative to nonconvex constraints

For solution mappings of parameterized models (such as optimization problems, variational inequalities, and generalized equations), standard stability inevitably fails as the parameter approaches the boundary of the feasible domain. One remedy is relative stability restricted to a constraint set (e.g., the feasible domain), which is our focus in this paper. We establish generalized differentiation criteria … Read more
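
To illustrate what stability relative to a constraint set can mean in the Euclidean setting (the paper's exact formulation may differ), a solution mapping $S$ has the Lipschitz-like (Aubin) property relative to a parameter set $\Omega$ around $(\bar{p},\bar{x}) \in \operatorname{gph} S$ if there exist neighborhoods $V$ of $\bar{p}$ and $U$ of $\bar{x}$ and a constant $\kappa \ge 0$ such that
$$ S(p') \cap U \subset S(p) + \kappa\,\|p'-p\|\,\mathbb{B} \quad \text{for all } p, p' \in \Omega \cap V, $$
so that only perturbations of the parameter inside the constraint set $\Omega$ (e.g., the feasible domain) are taken into account, and the property can hold even when the unrestricted version fails at boundary parameters.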

Newtonian Methods with Wolfe Linesearch in Nonsmooth Optimization and Machine Learning

This paper introduces and develops coderivative-based Newton methods with Wolfe linesearch conditions to solve various classes of problems in nonsmooth optimization and machine learning. We first propose a generalized regularized Newton method with Wolfe linesearch (GRNM-W) for unconstrained $C^{1,1}$ minimization problems (which are second-order nonsmooth) and establish global as well as local superlinear convergence of … Read more
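
The coderivative-based construction behind GRNM-W is not given in this excerpt. As a rough, hypothetical sketch of the overall scheme (a regularized Newton step combined with a Wolfe linesearch), one might write the following, where the names `grnm_w`, `wolfe_linesearch`, and the user-supplied surrogate `hess` are illustrative stand-ins rather than the paper's actual construction:

```python
import numpy as np

def wolfe_linesearch(f, grad, x, d, c1=1e-4, c2=0.9, t=1.0, max_iter=50):
    """Bisection/expansion search for a step satisfying the (weak) Wolfe conditions."""
    f0, g0 = f(x), grad(x) @ d
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0:      # sufficient-decrease condition violated
            hi = t
        elif grad(x + t * d) @ d < c2 * g0:      # curvature condition violated
            lo = t
        else:
            return t
        t = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
    return t

def grnm_w(f, grad, hess, x0, mu=1e-3, tol=1e-8, max_iter=200):
    """Hypothetical regularized Newton iteration with Wolfe linesearch.

    `hess(x)` is any symmetric surrogate for a second-order object of the
    C^{1,1} function f (e.g., a finite-difference or sampled Hessian); the
    coderivative-based choice made in the paper is not implemented here.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        # Regularize so the linear system is well posed and d is a descent direction.
        reg = mu * np.linalg.norm(g)
        d = np.linalg.solve(hess(x) + reg * np.eye(len(x)), -g)
        if g @ d >= 0:                           # fall back to steepest descent if needed
            d = -g
        t = wolfe_linesearch(f, grad, x, d)
        x = x + t * d
    return x
```

Tying the regularization weight to $\|\nabla f(x_k)\|$ is a common device for keeping the step well defined far from a solution while allowing fast local convergence near it; the paper's actual parameter rules and its global and superlinear convergence guarantees should be taken from the full text.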