A Multivariate Loss Ratio Approach for Systemic Risk Measurement and Allocation

The primary challenges in systemic risk measurement involve determining an overall reserve level of risk capital and allocating it to different components based on their systemic relevance. In this paper, we introduce a multivariate loss ratio measure (MLRM), which is the minimum amount of capital to be injected into a financial system such that the …

Sequential test sampling for stochastic derivative-free optimization

In many derivative-free optimization algorithms, a sufficient decrease condition decides whether to accept a trial step in each iteration. This condition typically requires that the potential objective function value decrease of the trial step, i.e., the true reduction in the objective function value that would be achieved by moving from the current point to the …
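To make the sufficient decrease condition concrete, here is a minimal sketch of a generic acceptance test of the kind used in direct-search methods. The constant `c` and the forcing function `c * alpha**2` are common textbook choices, not taken from this paper:

```python
# Generic sufficient-decrease test: accept the trial point only if it
# reduces f by at least a forcing-function amount c * alpha**2, where
# alpha is the current step size. Illustrative constants only.

def sufficient_decrease(f, x, trial, alpha, c=1e-4):
    return f(x) - f(trial) >= c * alpha**2

f = lambda x: (x - 2.0) ** 2
x, alpha = 0.0, 1.0

accepted = sufficient_decrease(f, x, x + alpha, alpha)   # f drops 4 -> 1
rejected = sufficient_decrease(f, x, x + 4.1, alpha)     # f rises 4 -> 4.41
```

The forcing function makes acceptance harder as the step size grows, which rules out accepting steps whose decrease is negligible relative to their length.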

Bregman Douglas-Rachford Splitting Method

In this paper, we propose the Bregman Douglas-Rachford splitting (BDRS) method and its variant, the Bregman Peaceman-Rachford splitting method, for solving maximally monotone inclusion problems. We show that BDRS is equivalent to a Bregman alternating direction method of multipliers (ADMM) when applied to the dual of the problem. A special case of the Bregman ADMM is …

Extracting Alternative Solutions from Benders Decomposition

We show how to extract alternative solutions for optimization problems solved by Benders Decomposition. In practice, alternative solutions provide useful insights for complex applications; some solvers do support generation of alternative solutions but none appear to support such generation when using Benders Decomposition. We propose a new post-processing method that extracts multiple optimal and …
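Benders decomposition itself is beyond a short sketch, but what "alternative optimal solutions" means can be shown on a toy binary program (illustrative only, not the paper's post-processing method):

```python
from itertools import product

# Toy binary program with two alternative optima:
# maximize x1 + x2 subject to x1 + x2 <= 1, x in {0, 1}^2.
# Both (0, 1) and (1, 0) attain the optimal value 1.

def all_optimal():
    feasible = [x for x in product((0, 1), repeat=2) if x[0] + x[1] <= 1]
    best = max(sum(x) for x in feasible)
    return [x for x in feasible if sum(x) == best]

print(all_optimal())  # → [(0, 1), (1, 0)]
```

Enumerating optima is only viable for tiny problems; the point of a post-processing method is to recover such alternatives without brute force.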

Gradient Methods with Online Scaling Part II. Practical Aspects

Part I of this work [Gao25] establishes online scaled gradient methods (OSGM), a framework that utilizes online convex optimization to adapt stepsizes in gradient methods. This paper focuses on the practical aspects of OSGM. We leverage the OSGM framework to design new adaptive first-order methods and provide insights into their empirical behavior. The resulting method, …
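The idea of adapting stepsizes online can be illustrated with a generic hypergradient heuristic, where the stepsize itself is updated by a gradient step. This is a sketch in the same spirit, not the OSGM algorithm from the paper:

```python
import numpy as np

# Gradient descent on f(x) = 0.5 * ||x||^2 with a hypergradient-adapted
# scalar stepsize: eta is nudged by the inner product of consecutive
# gradients, a standard online heuristic (illustrative, not OSGM).

def grad(x):
    return x            # gradient of 0.5 * ||x||^2

x = np.array([5.0, -3.0])
eta, beta = 0.01, 0.001          # initial stepsize, meta-stepsize
g_prev = grad(x)
for _ in range(200):
    g = grad(x)
    # d f(x_k) / d eta is proportional to -g_k . g_{k-1}, so increase
    # eta when consecutive gradients align, decrease it when they oppose
    eta = max(eta + beta * float(g @ g_prev), 1e-6)
    g_prev = g
    x = x - eta * g
```

On this quadratic the stepsize grows from its deliberately small initial value and the iterates contract geometrically toward the minimizer.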

On the convergence rate of the Douglas-Rachford splitting algorithm

This work is concerned with the convergence rate analysis of the Douglas–Rachford splitting (DRS) method for finding a zero of the sum of two maximally monotone operators. We obtain an exact rate of convergence for the DRS algorithm and demonstrate its sharpness in the setting of convex feasibility problems. Furthermore, we investigate the …
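The convex feasibility setting mentioned above is easy to demonstrate. Here is a minimal DRS sketch for finding a point in the intersection of two intervals, using the standard recursion z_{k+1} = z_k + P_B(2 P_A(z_k) - z_k) - P_A(z_k); the shadow sequence P_A(z_k) converges to a point in the intersection:

```python
import numpy as np

# Douglas-Rachford splitting for the toy feasibility problem
# find x in [0, 2] ∩ [1, 3]; the intersection is [1, 2].

P_A = lambda z: np.clip(z, 0.0, 2.0)   # projection onto [0, 2]
P_B = lambda z: np.clip(z, 1.0, 3.0)   # projection onto [1, 3]

z = 10.0
for _ in range(100):
    x = P_A(z)
    z = z + P_B(2 * x - z) - x
x = P_A(z)   # shadow point, lands in [1, 2]
```

For two intersecting convex sets the governing sequence converges to a fixed point whose projection onto the first set is feasible for both.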

A Minimalist Bayesian Framework for Stochastic Optimization

The Bayesian paradigm offers principled tools for sequential decision-making under uncertainty, but its reliance on a probabilistic model for all parameters can hinder the incorporation of complex structural constraints. We introduce a minimalist Bayesian framework that places a prior only on the component of interest, such as the location of the optimum. Nuisance parameters are …

What is the Best Way to Do Something? A Discreet Tour of Discrete Optimization

In mathematical optimization, we want to find the best possible solution for a decision-making problem. Curiously, these problems are harder to solve if they have discrete decisions. Imagine that you would like to buy chocolate: you can buy no chocolate or one chocolate bar, but typically you cannot buy just half of a bar. Now …
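The chocolate example can be written out in a few lines. With an illustrative budget of 5 and a bar price of 2, the continuous relaxation suggests buying 2.5 bars, but the integer restriction only allows 2:

```python
# Discrete vs. continuous decisions, in the spirit of the chocolate
# example: bars must be bought in whole numbers.

budget, price = 5, 2

fractional_opt = budget / price                      # 2.5 bars, not purchasable
integer_opt = max(n for n in range(budget // price + 1)
                  if n * price <= budget)            # 2 whole bars
```

The gap between the relaxed and the integer optimum is exactly what makes discrete problems harder: rounding a fractional solution is not always feasible, let alone optimal, in larger problems.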

When Wasserstein DRO Reduces Exactly: Complete Characterization, Projection Equivalence, and Regularization

Wasserstein distributionally robust optimization (DRO), a leading paradigm in data-driven decision-making, requires evaluating worst-case risk over a high-dimensional Wasserstein ball. We study when this worst-case evaluation admits an exact reduction to a one-dimensional formulation, in the sense that it can be carried out over a one-dimensional Wasserstein ball centered at the projected reference distribution. We …
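The regularization connection mentioned in the title has a classical special case worth illustrating (a textbook fact, not the characterization from this paper): for a linear loss l(x) = c @ x, the worst-case expectation over a type-1 Wasserstein ball of radius eps around the empirical distribution equals the empirical mean plus eps * ||c||, and the bound is attained by shifting every atom by eps in the direction c / ||c||:

```python
import numpy as np

# Worst-case linear loss over a type-1 Wasserstein ball (Euclidean
# ground metric): closed form = empirical mean + eps * ||c||.

rng = np.random.default_rng(0)
data = rng.normal(size=(50, 3))       # empirical atoms
c = np.array([1.0, -2.0, 0.5])        # linear loss l(x) = c @ x
eps = 0.3                             # ball radius

closed_form = (data @ c).mean() + eps * np.linalg.norm(c)

# Attaining distribution: transport each atom distance eps along c/||c||,
# which costs exactly eps in Wasserstein-1 distance.
shifted = data + eps * c / np.linalg.norm(c)
attained = (shifted @ c).mean()
```

Exactness of such reductions in more general settings is precisely the kind of question the abstract describes.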