Cut-Sharing Across Trees and Efficient Sequential Sampling for SDDP with Uncertainty in the RHS

In this paper we show that when a multistage stochastic problem with stage-wise independent realizations has uncertainty only in the right-hand side (RHS), solving one tree provides a valid lower bound for all trees with the same number of scenarios per stage, without any additional computational effort. The only change to the traditional algorithm is the way cuts are … Read more
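To see why RHS-only uncertainty lends itself to cut sharing, recall the standard Benders/SDDP dual argument; the sketch below uses generic notation (W, T, h, q are illustrative and not necessarily the paper's), with the stage subproblem written as

\[
Q(x,\xi) \;=\; \min_{y \ge 0} \,\{\, q^\top y : W y \ge h(\xi) - T x \,\}
\;=\; \max_{\pi \in \Pi} \; \pi^\top \big( h(\xi) - T x \big),
\qquad
\Pi = \{\pi \ge 0 : W^\top \pi \le q\}.
\]

Since only h(ξ) depends on the realization, the dual feasible set Π does not; a dual vertex π̂ obtained at one realization therefore yields a cut Q(x,ξ) ≥ π̂ᵀ(h(ξ) − Tx) that remains valid for any other realization after recomputing only the intercept π̂ᵀh(ξ).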

Characterization of an Anomalous Behavior of a Practical Smoothing Technique

A practical smoothing method was analyzed and tested against state-of-the-art solvers for some non-smooth optimization problems in [BSS20a; BSS20b]. This method can be used to smooth the value functions and solution mappings of fully parameterized convex problems under mild conditions. In general, the smoothed value function approximates the true value function from above … Read more
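For reference, the objects being smoothed can be written in the generic fully parameterized convex setting (f, g_i and the parameter p are illustrative notation, not that of [BSS20a; BSS20b]):

\[
v(p) \;=\; \min_{x} \,\{\, f(x,p) : g_i(x,p) \le 0,\; i=1,\dots,m \,\},
\qquad
S(p) \;=\; \operatorname*{arg\,min}_{x} \,\{\, f(x,p) : g_i(x,p) \le 0 \,\},
\]

with f(·,p) and g_i(·,p) convex for every p. The value function v and the solution mapping S are the maps the smoothing technique approximates.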

Decomposition Algorithms for Some Deterministic and Two-Stage Stochastic Single-Leader Multi-Follower Games

We consider a certain class of hierarchical decision problems that can be viewed as single-leader multi-follower games and can be represented by a virtual market coordinator trying to set a price system for traded goods, according to some criterion that balances supply and demand. The objective function of the market coordinator involves the decisions of many … Read more
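As a rough illustration of this structure (the generic single-leader multi-follower form, not the paper's precise model; Φ and θ_i are placeholders), in the simplest case where followers do not interact directly the coordinator's problem reads

\[
\min_{x \in X} \;\Phi\big(x, y_1(x), \dots, y_N(x)\big)
\qquad \text{where, for each follower } i, \quad
y_i(x) \in \operatorname*{arg\,min}_{y_i \in Y_i} \; \theta_i(x, y_i),
\]

with x collecting the leader's decisions (e.g., the price system) and y_i(x) the i-th follower's best response.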

A Regularized Smoothing Method for Fully Parameterized Convex Problems with Applications to Convex and Nonconvex Two-Stage Stochastic Programming

We present an approach to regularize and approximate solution mappings of parametric convex optimization problems that combines interior penalty (log-barrier) solutions with Tikhonov regularization. Because the regularized mappings are single-valued and smooth under reasonable conditions, they can be used to build a computationally practical smoothing for the associated optimal value function. The value function in … Read more
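A minimal sketch of the two ingredients named above, log-barrier interior penalty plus Tikhonov regularization, with generic notation (f, g_i, μ, ε are illustrative rather than the paper's): for a parametric convex problem min_x { f(x,p) : g_i(x,p) ≤ 0 }, the regularized mapping is

\[
x_{\mu,\varepsilon}(p) \;=\; \operatorname*{arg\,min}_{x} \;\Big\{ f(x,p) \;-\; \mu \sum_{i=1}^{m} \log\big(-g_i(x,p)\big) \;+\; \frac{\varepsilon}{2}\,\|x\|^2 \Big\},
\qquad \mu,\ \varepsilon > 0.
\]

The objective is strongly convex and smooth on the interior of the feasible set, so the minimizer is unique and, under standard regularity, varies smoothly with p; evaluating f at x_{μ,ε}(p) then yields a smooth approximation of the optimal value function.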

Convex Variational Formulations for Learning Problems

In this article, we introduce new techniques to solve the nonlinear regression problem and the nonlinear classification problem. Our benchmarks suggest that our method for regression is significantly more effective than classical methods, and that our method for classification is competitive. Our list of classical methods includes least squares, random forests, decision trees, boosted … Read more

Optimization Methods for Locating Heteroclinic Orbits

Assume we are given a system of ordinary differential equations x′ = f(x, p) depending on a parameter p ∈ ℝ^{p_e}. In this dissertation we consider the problem of locating a parameter p and an initial condition ξ that give rise to a heteroclinic orbit. In the case that such p and … Read more
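For concreteness, the defining conditions for a heteroclinic orbit in this setting are standard: one seeks a parameter p, distinct equilibria x⁻ ≠ x⁺, and an initial condition ξ such that

\[
f(x^-,p) = 0,\qquad f(x^+,p) = 0,\qquad
x(0) = \xi,\quad x'(t) = f\big(x(t),p\big),\qquad
\lim_{t\to-\infty} x(t) = x^-,\qquad \lim_{t\to+\infty} x(t) = x^+ .
\]

A common computational approach (not necessarily the one taken in this dissertation) truncates the time interval and replaces the asymptotic limits with boundary conditions tied to the stable and unstable subspaces of the linearizations at x⁻ and x⁺, which makes the search for p and ξ amenable to optimization.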