The \emph{optimal value function} is one of the basic objects in the field of mathematical optimization, as it allows the evaluation of the variations in the \emph{cost/revenue} generated while \emph{minimizing/maximizing} a given function under some constraints. In the context of stability/sensitivity analysis, a large number of publications have been dedicated to the study of continuity and differentiability properties of the optimal value function. In the current literature, the differentiability analysis has mostly been limited to the first order, with a focus on estimates of the function's directional derivatives and subdifferentials, given that it is typically nonsmooth. With the progress made in the last two to three decades in major subfields of optimization such as robust, minimax, semi-infinite, and bilevel optimization, and their connection to the optimal value function, there is a crucial need for a \emph{second order analysis of the generalized differentiability properties} of this function. Such an analysis would promote the development of robust solution methods, such as Newton's method, which is very popular in nonlinear optimization. The main goal of this paper is to provide results in this direction. In fact, we derive estimates of the \emph{generalized Hessian} (also known as the second order subdifferential) for the optimal value function. Our results are based on two handy tools from parametric optimization, namely the optimal solution and Lagrange multiplier mappings, for which detailed estimates of their generalized derivatives are either well-known or can easily be obtained.
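For concreteness, the parametric setup underlying the abstract can be sketched in standard notation (a generic illustration; the symbols below are not taken from the paper itself):

```latex
% Parametric program with parameter x and decision variable y:
%   f is the objective, g collects the inequality constraints.
% The optimal value function \varphi and the optimal solution map S
% are the objects whose generalized derivatives are estimated.
\varphi(x) \;=\; \min_{y}\ \bigl\{\, f(x,y) \;:\; g(x,y) \le 0 \,\bigr\},
\qquad
S(x) \;=\; \operatorname*{argmin}_{y}\ \bigl\{\, f(x,y) \;:\; g(x,y) \le 0 \,\bigr\}.
```

In this generic form, the paper's program is to bound the second order subdifferential (generalized Hessian) of \(\varphi\) in terms of generalized derivatives of the solution map \(S\) and of the associated Lagrange multiplier mapping.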

## Citation

arXiv:1710.05887

## Article

Estimates of generalized Hessians for optimal value functions in mathematical programming