It is well known that the eigenvalues of a real symmetric matrix are not everywhere differentiable. A classical result of Ky Fan implies that each eigenvalue of a symmetric matrix can be written as the difference of two convex functions, and hence that the eigenvalues of a symmetric matrix are semismooth everywhere. Based on a very recent result of the authors, it is further proved in this paper that the eigenvalues of a symmetric matrix are strongly semismooth everywhere. As an application, it is demonstrated how this result can be used to analyze the quadratic convergence of Newton's methods for solving inverse eigenvalue problems (IEPs). This not only extends, in a systematic way, the quadratic convergence results of Friedland, Nocedal and Overton [SIAM Journal on Numerical Analysis, Vol. 24, 1987, pp. 634--667] and others, but also gives an affirmative answer to a conjecture made by Dai and Lancaster [Numerical Linear Algebra with Applications, Vol. 4, 1997, pp. 1--21] on the quadratic convergence of Newton's methods for solving generalized IEPs with multiple eigenvalues.
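The decomposition behind the second sentence can be written out explicitly; the following is a standard consequence of Ky Fan's maximum principle (the notation $\lambda_1(X) \ge \cdots \ge \lambda_n(X)$ for the ordered eigenvalues of a symmetric matrix $X$ is ours):
\[
\sum_{i=1}^{k}\lambda_i(X) \;=\; \max_{\,U\in\mathbb{R}^{n\times k},\; U^{T}U=I_k}\ \operatorname{tr}\bigl(U^{T}XU\bigr),
\qquad k=1,\dots,n,
\]
so each partial sum is a pointwise maximum of linear functions of $X$ and therefore convex, and
\[
\lambda_k(X)\;=\;\sum_{i=1}^{k}\lambda_i(X)\;-\;\sum_{i=1}^{k-1}\lambda_i(X)
\]
exhibits each eigenvalue as a difference of two convex functions.

As an illustration of the kind of Newton iteration whose convergence is at issue, below is a minimal Python/NumPy sketch for the additive IEP, in the spirit of Method I of Friedland, Nocedal and Overton; it is not the paper's algorithm. The diagonal basis matrices, random data, target spectrum, starting point, and tolerance are illustrative assumptions, and the Jacobian formula $J_{ik} = q_i^{T} A_k q_i$ presumes the target eigenvalues are simple.

```python
import numpy as np

def newton_aiep(A0, basis, lam_star, c0, tol=1e-12, max_iter=50):
    """Newton iteration for the additive IEP: find c such that the
    eigenvalues of A(c) = A0 + sum_k c[k] * basis[k] equal lam_star.
    Assumes the target eigenvalues are simple, so the Jacobian
    J[i, k] = q_i^T basis[k] q_i is well defined."""
    c = np.array(c0, dtype=float)
    lam_star = np.sort(np.asarray(lam_star, dtype=float))
    n = lam_star.size
    for _ in range(max_iter):
        A = A0 + sum(ck * Ak for ck, Ak in zip(c, basis))
        lam, Q = np.linalg.eigh(A)        # ascending eigenvalues, orthonormal Q
        r = lam - lam_star                # eigenvalue residual
        if np.linalg.norm(r) < tol:
            break
        # Derivative of the i-th (simple) eigenvalue with respect to c[k]
        J = np.array([[Q[:, i] @ basis[k] @ Q[:, i] for k in range(n)]
                      for i in range(n)])
        c += np.linalg.solve(J, -r)       # Newton step
    return c

# Illustrative data: A(c) = A0 + diag(c) with a random symmetric A0
# having zero diagonal (all numbers here are made up for the sketch).
rng = np.random.default_rng(0)
n = 4
basis = [np.zeros((n, n)) for _ in range(n)]
for k in range(n):
    basis[k][k, k] = 1.0
off = 0.3 * rng.standard_normal((n, n))
A0 = (off + off.T) / 2.0
np.fill_diagonal(A0, 0.0)
lam_star = np.array([1.0, 2.0, 4.0, 8.0])
c = newton_aiep(A0, basis, lam_star, c0=lam_star)
print(np.linalg.eigvalsh(A0 + np.diag(c)))  # close to lam_star
```

The simple-eigenvalue Jacobian used above is exactly what breaks down at multiple eigenvalues; the strong semismoothness result of the paper is what supports a quadratic convergence analysis in that more delicate setting, including the generalized IEPs with multiple eigenvalues covered by the Dai-Lancaster conjecture.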
Citation
Manuscript, Department of Decision Sciences / Department of Mathematics, National University of Singapore, Singapore 119260, Aug. 2001.