Distributionally and Adversarially Robust Logistic Regression via Intersecting Wasserstein Balls

Adversarially robust optimization (ARO) has become the de facto standard for training models to defend against adversarial attacks during testing. However, despite their robustness, these models often suffer from severe overfitting. To mitigate this issue, several successful approaches have been proposed, including replacing the empirical distribution in training with: (i) a worst-case distribution within an …
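As a toy illustration of the adversarial side of this setup (a sketch under stated assumptions, not the paper's intersecting-balls method): for a linear classifier under l_inf-bounded test-time attacks, the inner worst case of the logistic loss has a closed form, so adversarial training reduces to minimizing a shifted-margin loss. All data and parameter names below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def adversarial_logistic_loss(w, X, y, eps):
    """Worst-case logistic loss under l_inf-bounded feature perturbations.

    For a linear model, the inner maximization over ||delta||_inf <= eps
    is solved in closed form: the attacker lowers each margin
    y_i * w^T x_i by exactly eps * ||w||_1.
    """
    margins = y * (X @ w)                              # labels y in {-1, +1}
    worst_margins = margins - eps * np.linalg.norm(w, 1)
    return np.mean(np.logaddexp(0.0, -worst_margins))  # log(1 + e^{-m})

# Toy data (hypothetical, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 1.0]) + 0.1 * rng.normal(size=200))

w0 = 0.01 * np.ones(5)  # start away from the kink of ||w||_1 at the origin
w_robust = minimize(adversarial_logistic_loss, w0, args=(X, y, 0.1)).x
print(w_robust)
```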

It’s All in the Mix: Wasserstein Machine Learning with Mixed Features

Belbasi R., Selvi A., Wiesemann W. (December 2023). It's all in the mix: Wasserstein machine learning with mixed features. Preprint.

Differential Privacy via Distributionally Robust Optimization

In recent years, differential privacy has emerged as the de facto standard for sharing statistics of datasets while limiting the disclosure of private information about the involved individuals. This is achieved by randomly perturbing the statistics to be published, which in turn leads to a privacy-accuracy trade-off: larger perturbations provide stronger privacy guarantees, but they …
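A minimal sketch of this perturbation idea, using the textbook Laplace mechanism rather than the optimized mechanisms the paper designs:

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Release `value` with epsilon-differential privacy by adding Laplace
    noise of scale sensitivity / epsilon. A larger scale (smaller epsilon)
    gives stronger privacy but a noisier, less accurate release."""
    rng = rng if rng is not None else np.random.default_rng()
    return value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# A count query changes by at most 1 when one individual is added or
# removed, so its sensitivity is 1.
true_count = 1234
print(laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5))
```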

Wasserstein Logistic Regression with Mixed Features

Recent work has leveraged the popular distributionally robust optimization paradigm to combat overfitting in classical logistic regression. While the resulting classification scheme displays a promising performance in numerical experiments, it is inherently limited to numerical features. In this paper, we show that distributionally robust logistic regression with mixed (i.e., numerical and categorical) features, despite amounting …
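For the purely numerical-feature case that this paper generalizes, a well-known reformulation collapses the Wasserstein-robust logistic loss into an empirical loss with a dual-norm penalty on the weights. A hedged sketch, assuming a type-1 ball with l_2 ground metric on the features and labels held fixed:

```python
import numpy as np
from scipy.optimize import minimize

def dro_logistic_loss(w, X, y, radius):
    """Wasserstein-robust logistic loss (type-1 ball of the given radius,
    l_2 ground metric on the features, labels held fixed). The inner sup
    over distributions collapses to a dual-norm penalty on the weights:
    empirical logistic loss + radius * ||w||_2."""
    margins = y * (X @ w)                  # labels y in {-1, +1}
    return np.mean(np.logaddexp(0.0, -margins)) + radius * np.linalg.norm(w)

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 4))
y = np.sign(X @ np.array([2.0, -1.0, 0.0, 0.5]))
w0 = 0.01 * np.ones(4)                     # start away from the norm's kink
print(minimize(dro_logistic_loss, w0, args=(X, y, 0.05)).x)
```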

A Reformulation-Linearization Technique for Optimization over Simplices

We study non-convex optimization problems over simplices. We show that for a large class of objective functions, the convex approximation obtained from the Reformulation-Linearization Technique (RLT) admits optimal solutions that exhibit a sparsity pattern. This characteristic of the optimal solutions allows us to conclude that (i) a linear matrix inequality constraint, which is often added …
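A minimal sketch of the basic RLT relaxation for a quadratic over a simplex (the lifted variables and constraint products below are the standard textbook ones, not the paper's refinements):

```python
import numpy as np
from scipy.optimize import linprog

def rlt_bound(Q):
    """RLT lower bound for min { x^T Q x : x >= 0, sum(x) = 1 }.

    Lift the products x_i * x_j to new variables X_ij and keep only the
    linear RLT constraints: X >= 0 (from x_i >= 0 times x_j >= 0) and
    sum_j X_ij = x_i (from multiplying sum_j x_j = 1 by x_i). The
    relaxation is a linear program in (x, X).
    """
    n = Q.shape[0]
    c = np.concatenate([np.zeros(n), Q.flatten()])  # objective <Q, X>

    A_eq, b_eq = [], []
    for i in range(n):                 # sum_j X_ij - x_i = 0
        row = np.zeros(n + n * n)
        row[i] = -1.0
        row[n + i * n : n + (i + 1) * n] = 1.0
        A_eq.append(row); b_eq.append(0.0)
    row = np.zeros(n + n * n)          # sum_i x_i = 1
    row[:n] = 1.0
    A_eq.append(row); b_eq.append(1.0)

    res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=[(0, None)] * (n + n * n))
    return res.fun

Q = np.array([[0.0, 1.0], [1.0, 0.0]])  # x^T Q x = 2 x_1 x_2
print(rlt_bound(Q))                     # 0.0: here the bound is tight
```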

Convex Maximization via Adjustable Robust Optimization

Maximizing a convex function over convex constraints is an NP-hard problem in general. We prove that such a problem can be reformulated as an adjustable robust optimization (ARO) problem where each adjustable variable corresponds to a unique constraint of the original problem. We use ARO techniques to obtain approximate solutions to the convex maximization problem. …
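One standard identity that hints at why a two-stage structure appears (shown for intuition only; the paper's actual ARO construction, with one adjustable variable per constraint, is different and more involved) is the biconjugate representation of a closed convex function:

```latex
% Fenchel--Moreau: a closed convex f is the pointwise supremum of its
% affine minorants, with conjugate f*(w) := sup_x { w^T x - f(x) }:
\[
  \max_{x \in X} f(x)
  \;=\;
  \max_{x \in X} \; \sup_{w} \bigl\{ w^\top x - f^*(w) \bigr\},
\]
% so convex maximization embeds an inner maximization over affine
% functions of x, the kind of nested structure ARO techniques target.
```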