We study a class of polynomial optimization problems with a robust polynomial matrix
inequality constraint, where the uncertainty set is itself defined by a polynomial matrix inequality (including robust polynomial semidefinite programs as a special case).
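In generic notation (the symbols $f$, $P$, $Q$, and $\mathcal{U}$ are ours, chosen for illustration and not necessarily the paper's), this problem class can be sketched as:

```latex
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{s.t.} \quad P(x, y) \succeq 0 \;\; \forall\, y \in \mathcal{U},
\qquad \mathcal{U} := \{\, y \in \mathbb{R}^m : Q(y) \succeq 0 \,\},
```

where $f$ is a polynomial and $P$, $Q$ are symmetric matrices with polynomial entries; when the uncertainty set $\mathcal{U}$ degenerates appropriately, one recovers a robust polynomial semidefinite program.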
Under certain SOS-convexity assumptions, we construct a hierarchy of moment-SOS relaxations
for this problem to obtain convergent upper bounds of the optimal value by solving
a sequence of semidefinite programs.
To this end, we apply the Positivstellensatz for polynomial matrices and its dual
matrix-valued moment theory to a conic reformulation of the problem. Most of the nice features of the moment-SOS hierarchy for the scalar polynomial optimization are generalized to the matrix case. In particular, the finite convergence of the hierarchy can be also certified if the flat extension condition holds.
To extract global minimizers in this case, we develop a linear algebra approach to recover the representing matrix-valued measure for the corresponding truncated
matrix-valued moment problem. As an application, we use this hierarchy to solve the problem of minimizing the smallest eigenvalue of a polynomial matrix subject to a polynomial matrix inequality.
Finally, if SOS-convexity is replaced by convexity, we can still approximate the optimal value as closely as desired by solving a sequence of semidefinite programs, and certify global optimality provided that certain flat extension conditions hold.
A Moment-SOS Hierarchy for Robust Polynomial Matrix Inequality Optimization with SOS-Convexity