A superlinearly convergent subgradient method for sharp semismooth problems

Subgradient methods comprise a fundamental class of nonsmooth optimization algorithms. Classical results show that certain subgradient methods converge sublinearly for general Lipschitz convex functions and converge linearly for convex functions that grow sharply away from their solution sets. Moreover, recent work has extended these results to certain nonconvex problems. In this work we seek to improve the …
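To illustrate the sharpness condition the abstract refers to (and not the method proposed in the paper), here is a minimal Python sketch of the classical Polyak-step subgradient method applied to f(x) = ||x||_1, which is sharp around its minimizer x* = 0. The function names and the test problem are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def polyak_subgradient(f, subgrad, x0, f_star, iters=50):
    """Subgradient method with the Polyak step size
    alpha_k = (f(x_k) - f_star) / ||g_k||^2.
    For sharp functions (f(x) - f_star >= mu * dist(x, X*)),
    this step rule contracts the distance to X* linearly."""
    x = np.asarray(x0, dtype=float)
    gaps = []
    for _ in range(iters):
        g = subgrad(x)
        gap = f(x) - f_star
        gaps.append(gap)
        if gap <= 0 or not np.any(g):
            break
        x = x - (gap / np.dot(g, g)) * g
    return x, gaps

# Illustrative sharp problem: f(x) = ||x||_1, sharp around x* = 0.
f = lambda x: np.sum(np.abs(x))
subgrad = lambda x: np.sign(x)          # a subgradient of the 1-norm
x, gaps = polyak_subgradient(f, subgrad, x0=np.array([3.0, -2.0]), f_star=0.0)
print(gaps)                             # optimality gaps at each iteration
```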

Löwner’s Operator and Spectral Functions in Euclidean Jordan Algebras

We study the analyticity, differentiability, and semismoothness of Löwner’s operator and spectral functions in the framework of Euclidean Jordan algebras. In particular, we show that many classical optimization-related results for the space of symmetric matrices can be generalized to this framework. For example, the metric projection operator onto any symmetric cone defined in a Euclidean Jordan algebra …
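In the symmetric-matrix special case that the abstract generalizes from, a Löwner operator applies a scalar function to the eigenvalues of a symmetric matrix and recombines the result with the same eigenvectors; taking the scalar function t ↦ max(t, 0) gives the metric projection onto the positive semidefinite cone. The NumPy sketch below covers only this special case, not the general Jordan-algebraic construction, and its names are illustrative.

```python
import numpy as np

def loewner(scalar_fn, X):
    """Löwner operator in the symmetric-matrix case: apply scalar_fn
    to the eigenvalues of X and recombine with X's eigenvectors."""
    w, Q = np.linalg.eigh(X)                    # X = Q diag(w) Q^T
    return Q @ np.diag(scalar_fn(w)) @ Q.T

def project_psd(X):
    """Metric projection onto the PSD cone = Löwner operator of max(t, 0)."""
    return loewner(lambda w: np.maximum(w, 0.0), X)

A = np.array([[1.0,  2.0],
              [2.0, -3.0]])
P = project_psd(A)
print(np.linalg.eigvalsh(P))                    # nonnegative up to rounding
```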

An Algorithm for Perturbed Second-order Cone Programs

The second-order cone programming problem is reformulated into several new systems of nonlinear equations. Assuming the data perturbation lies in a certain neighborhood of zero, and starting from a solution of the original problem, the semismooth Newton iterates converge Q-quadratically to a solution of the perturbed problem. The algorithm is globalized. Numerical examples …
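As a simplified illustration of a semismooth Newton iteration on a nonsmooth system of equations (the paper's reformulations concern second-order cone programs; the sketch below instead uses the natural residual of a small linear complementarity problem, which fits in a few lines), the Python code takes full Newton steps using one element of the Clarke generalized Jacobian. The problem data and function names are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def semismooth_newton_lcp(M, q, x0, tol=1e-12, max_iter=20):
    """Semismooth Newton on the natural residual F(x) = min(x, Mx + q)
    of the LCP: x >= 0, Mx + q >= 0, x'(Mx + q) = 0.  Each step solves
    a linear system with one element of the generalized Jacobian of F;
    locally the convergence is Q-quadratic."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    I = np.eye(n)
    for k in range(max_iter):
        y = M @ x + q
        F = np.minimum(x, y)
        if np.linalg.norm(F) < tol:
            break
        # Generalized Jacobian element: row i is e_i if x_i <= y_i, else row i of M.
        J = np.where((x <= y)[:, None], I, M)
        x = x + np.linalg.solve(J, -F)
    return x, k

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
q = np.array([-1.0, 1.0])
x, iters = semismooth_newton_lcp(M, q, x0=np.array([1.0, 1.0]))
print(x, iters)     # reaches the solution [0.5, 0.0] in a few steps
```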