Shape-Constrained Regression using Sum of Squares Polynomials

We consider the problem of fitting a polynomial function to a set of data points, each data point consisting of a feature vector and a response variable. In contrast to standard polynomial regression, we require that the polynomial regressor satisfy shape constraints, such as monotonicity, Lipschitz continuity, or convexity. We show how to use semidefinite programming to obtain polynomial regressors that have these properties. We then prove that, under some assumptions on the generation of the data points, the regressor obtained is a consistent estimator of the underlying shape-constrained function. We follow up with a thorough empirical comparison of our regressor to the convex least squares estimator introduced in [Hildreth 1954, Holloway 1979] and show that our regressor can be very valuable in settings where the number of data points is large and where new predictions need to be made quickly and often. We also propose a method that relies on linear and second-order cone programs to quickly update our regressor when a new batch of data points is provided. We conclude with two novel applications. The first application aims to approximate the function that maps a conic program's data to its optimal value. This enables us to obtain quick estimates of the optimal value without solving the conic program, which can be useful for real-time decision-making. We illustrate this on an example in inventory management contract negotiation. In the second application, we compute optimal transport maps using shape constraints as regularizers, following [Paty 2020], and show, via a color transfer example, that this is a setting in which our regressor significantly outperforms other methods.
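To make the approach concrete, the sketch below fits a univariate polynomial regressor with a convexity constraint imposed through a sum of squares certificate on its second derivative. This is only an illustration of the underlying idea, not the paper's formulation (which handles multivariate features and further shape constraints such as monotonicity and Lipschitz continuity); the synthetic data, the polynomial degree, and the choice of cvxpy with the SCS solver are assumptions made here for the example.

```python
import numpy as np
import cvxpy as cp

# Synthetic 1-D data from a convex function plus noise (illustrative assumption).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200)
y = x**2 + 0.3 * np.abs(x) + 0.05 * rng.standard_normal(200)

d = 6                    # even degree of the regressor p(x) = sum_k c_k x^k (assumed here)
m = (d - 2) // 2         # half-degree of p''(x)
c = cp.Variable(d + 1)   # polynomial coefficients c_0, ..., c_d
Q = cp.Variable((m + 1, m + 1), PSD=True)  # Gram matrix certifying that p'' is a sum of squares

# Least-squares data fit: the Vandermonde matrix has columns 1, x, ..., x^d,
# so V @ c evaluates p at every data point.
V = np.vander(x, d + 1, increasing=True)
objective = cp.Minimize(cp.sum_squares(V @ c - y))

# Convexity certificate: p''(x) = z(x)^T Q z(x) with z(x) = (1, x, ..., x^m) and Q PSD.
# Matching coefficients of x^j on both sides; the coefficient of x^j in p'' is (j+2)(j+1) c_{j+2}.
constraints = []
for j in range(2 * m + 1):
    gram_coeff = sum(Q[i, j - i] for i in range(max(0, j - m), min(j, m) + 1))
    constraints.append(gram_coeff == (j + 2) * (j + 1) * c[j + 2])

prob = cp.Problem(objective, constraints)
prob.solve(solver=cp.SCS)  # any SDP-capable solver would do

print("fitted coefficients:", np.round(c.value, 3))
```

Since Q is positive semidefinite, p''(x) = z(x)^T Q z(x) is a sum of squares and hence nonnegative for every x, so the fitted polynomial is convex on the whole real line; the paper's semidefinite programming formulation plays the analogous role for multivariate polynomials and for the other shape constraints listed in the abstract.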