Numerical Optimization of Eigenvalues of Hermitian Matrix Functions

The eigenvalues of a Hermitian matrix function that depends analytically on one parameter can be ordered so that each eigenvalue is an analytic function of the parameter. Sorting these analytic eigenvalues from largest to smallest yields continuous, piecewise analytic functions. For multivariate Hermitian matrix functions that depend analytically on $d$ parameters, the eigenvalues sorted from largest to smallest are continuous and piecewise analytic along lines in $d$-dimensional space. These classical results imply that the second derivatives of the analytic pieces defining the sorted eigenvalue functions are bounded along any direction. We derive an algorithm based on the boundedness of these second derivatives for the global minimization of an eigenvalue of an analytic Hermitian matrix function. The algorithm, which is globally convergent, is driven by computing a global minimizer of a piecewise quadratic underestimator of the eigenvalue function and refining the underestimator iteratively. In the multivariate case, the computation of such a global minimizer decomposes into the solution of finitely many nonconvex quadratic programming problems. The derivatives of the eigenvalue functions are used to construct the quadratic models, which yield rapid global convergence in comparison with traditional global optimization algorithms. The applications we have in mind include the $H_{\infty}$ norm of a linear system, the numerical radius, the distance to uncontrollability, and the distance to a nearest defective matrix.
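The iteration described above can be sketched in the univariate case. This is a minimal illustration, not the report's implementation: the function names, the $2\times 2$ test family $A(x) = A_0 + xA_1$, and the curvature bound `gamma` are assumptions made here for the example. Quadratic support functions built from eigenvalue values and derivatives underestimate the eigenvalue function whenever `gamma` is a lower bound on its second derivative along the pieces; because all the quadratics share the same curvature, the global minimum of their maximum reduces to checking interval endpoints and pairwise intersections.

```python
import numpy as np

def model_minimum(xs, fs, dfs, gamma, lo, hi):
    """Exact global minimum over [lo, hi] of Q(x) = max_k q_k(x), where
    q_k(x) = fs[k] + dfs[k]*(x - xs[k]) + 0.5*gamma*(x - xs[k])**2.
    All q_k share the curvature gamma/2, so Q(x) = 0.5*gamma*x**2 +
    max_k (s_k*x + c_k): a quadratic plus a convex piecewise-linear term,
    minimized at an endpoint or at a pairwise intersection of the lines."""
    xs, fs, dfs = map(np.asarray, (xs, fs, dfs))
    s = dfs - gamma * xs                      # slopes of the affine parts
    c = fs - dfs * xs + 0.5 * gamma * xs**2   # intercepts of the affine parts
    cand = [lo, hi]
    for j in range(len(xs)):
        for k in range(j + 1, len(xs)):
            if s[j] != s[k]:
                x = (c[k] - c[j]) / (s[j] - s[k])
                if lo < x < hi:
                    cand.append(x)
    Q = lambda x: 0.5 * gamma * x**2 + np.max(s * x + c)
    vals = [Q(x) for x in cand]
    i = int(np.argmin(vals))
    return cand[i], vals[i]

def minimize_eigenvalue(f_and_df, gamma, lo, hi, tol=1e-6, max_iter=500):
    """Globally minimize f on [lo, hi], given gamma <= f'' along every piece.
    Each iteration minimizes the piecewise quadratic underestimator exactly,
    samples f there, and stops once the model certifies near-optimality."""
    xs = [lo, hi]
    fs, dfs = [], []
    for x in xs:
        f, d = f_and_df(x)
        fs.append(f); dfs.append(d)
    while len(xs) < max_iter:
        xc, q_min = model_minimum(xs, fs, dfs, gamma, lo, hi)
        if min(fs) - q_min < tol:  # underestimator within tol of best sample
            break
        f, d = f_and_df(xc)
        xs.append(xc); fs.append(f); dfs.append(d)
    i = int(np.argmin(fs))
    return xs[i], fs[i]

# Illustrative test problem (an assumption for this sketch): minimize the
# largest eigenvalue of the affine family A(x) = A0 + x*A1.
A0 = np.array([[1.0, 0.0], [0.0, -1.0]])
A1 = np.array([[0.0, 1.0], [1.0, 0.0]])

def f_and_df(x):
    # For a simple eigenvalue, the analytic derivative is v^T A'(x) v = v^T A1 v.
    w, V = np.linalg.eigh(A0 + x * A1)
    v = V[:, -1]
    return w[-1], v @ A1 @ v

# Here lambda_max(A(x)) = sqrt(1 + x^2), with f'' in (0, 1], so gamma = -1
# is a valid lower bound; the minimum over [-2, 2] is 1, attained at x = 0.
xstar, fstar = minimize_eigenvalue(f_and_df, gamma=-1.0, lo=-2.0, hi=2.0)
```

The termination test exploits the underestimation property: the model minimum `q_min` is a global lower bound on the eigenvalue function over the interval, so once the best sampled value is within `tol` of it, global near-optimality is certified.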

Citation

Technical Report, Department of Mathematics, Koc University, Sariyer, 34450, Istanbul, Turkey