Moment-sos and spectral hierarchies for polynomial optimization on the sphere and quantum de Finetti theorems

We revisit the convergence analysis of two approximation hierarchies for polynomial optimization on the unit sphere. The first is based on the moment-sos approach and gives semidefinite bounds, for which Fang and Fawzi (2021) established an \(O(1/r^2)\) convergence analysis for the \(r\)-th level bound using the polynomial kernel method. The second hierarchy was recently … Read more
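For minimizing a polynomial \(f\) over the unit sphere, the level-\(r\) moment-sos bound can be written (in one common form; the notation here is illustrative and need not match the paper's) as

\[
f_{(r)} \;=\; \sup\bigl\{\lambda \in \mathbb{R} \,:\, f - \lambda = \sigma + q\,(1 - \|x\|^2),\ \sigma \text{ a sum of squares},\ \deg \sigma \le 2r,\ q \in \mathbb{R}[x]\bigr\},
\]

and the Fang–Fawzi analysis bounds the gap to the true minimum \(f_{\min}\) as \(f_{\min} - f_{(r)} = O(1/r^2)\).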

MadNCL: A GPU Implementation of Algorithm NCL for Large-Scale, Degenerate Nonlinear Programs

We present a GPU implementation of Algorithm NCL, an augmented Lagrangian method for solving large-scale and degenerate nonlinear programs. Although interior-point methods and sequential quadratic programming are widely used for solving nonlinear programs, the augmented Lagrangian method is known to offer superior robustness against constraint degeneracies and can rapidly detect infeasibility. We introduce several enhancements … Read more
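For orientation, the classical augmented Lagrangian for an equality-constrained program \(\min_x f(x)\) s.t. \(c(x) = 0\) is

\[
L_\rho(x, y) \;=\; f(x) - y^\top c(x) + \frac{\rho}{2}\,\|c(x)\|^2,
\]

with multiplier estimate \(y\) and penalty parameter \(\rho\); minimizing \(L_\rho\) in \(x\) and updating \(y\) and \(\rho\) between subproblems yields the basic method of which Algorithm NCL is a variant. Because the subproblems remain well-posed even when the constraint Jacobian loses rank, the approach tolerates degeneracy that can stall interior-point methods.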

Consolidation in Crowdshipping with Scheduled Transfer Lines: A Surrogate-Based Network Design Framework

Crowdshipping has gained attention as an emerging delivery model thanks to advantages such as flexibility and an asset-light structure. Yet it chronically suffers from a lack of mechanisms to create and exploit consolidation opportunities, limiting its efficiency and scalability. This work contributes to the literature in two ways: first, by introducing a novel consolidation concept … Read more

Optimal participation of energy communities in electricity markets under uncertainty. A multi-stage stochastic programming approach

We propose a multi-stage stochastic programming model for the optimal participation of energy communities in electricity markets. The multi-stage aspect captures the different times at which variable renewable generation and electricity prices are observed. This results in large-scale optimization problem instances containing large scenario trees with 34 stages, to which scenario reduction techniques are applied. … Read more
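To see why scenario reduction is essential here, consider the size of a scenario tree: even with only two branches per stage, a 34-stage tree has \(2^{33}\) leaf scenarios. A toy calculation (illustrative only; the paper's trees and reduction method may differ):

```python
# Number of leaf scenarios in a uniform scenario tree:
# b branches per stage over T stages gives b**(T-1) scenarios.
def num_scenarios(branches: int, stages: int) -> int:
    return branches ** (stages - 1)

# Even a binary tree over 34 stages is astronomically large,
# which is why scenario reduction is applied before solving.
print(num_scenarios(2, 34))  # 2**33 = 8589934592
```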

On vehicle routing problems with stochastic demands — Generic integer L-shaped formulations

We study a broad class of vehicle routing problems in which the cost of a route is allowed to be any nonnegative rational value computable in polynomial time in the input size. To address this class, we introduce a unifying framework that generalizes existing integer L-shaped (ILS) formulations developed for vehicle routing problems with stochastic … Read more
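For reference, the classic integer L-shaped optimality cut of Laporte and Louveaux, which formulations of this type build on, cuts off a binary solution \(\bar{x}\) with support \(S = \{i : \bar{x}_i = 1\}\) via

\[
\theta \;\ge\; \bigl(Q(\bar{x}) - L\bigr)\left(\sum_{i \in S} x_i - \sum_{i \notin S} x_i - |S| + 1\right) + L,
\]

where \(\theta\) approximates the recourse cost \(Q(x)\) and \(L\) is a valid lower bound on \(Q\); the cut forces \(\theta \ge Q(\bar{x})\) at \(\bar{x}\) while remaining valid elsewhere. (Notation here is illustrative, not necessarily the paper's.)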

Continuous-time Analysis of a Stochastic ADMM Method for Nonconvex Composite Optimization

In this paper, we focus on nonconvex composite optimization, whose objective is the sum of a smooth but possibly nonconvex function and the composition of a weakly convex function with a linear operator. By leveraging a smoothing technique based on the Moreau envelope, we propose a stochastic proximal linearized ADMM algorithm (SPLA). To understand its … Read more
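The Moreau envelope of a function \(g\) with parameter \(\lambda > 0\) is

\[
M_{\lambda g}(y) \;=\; \min_z \left\{ g(z) + \frac{1}{2\lambda}\,\|z - y\|^2 \right\},
\]

which is a smooth under-approximation of \(g\) when \(g\) is weakly convex and \(\lambda\) is small enough; replacing the nonsmooth term by its envelope is the smoothing device referred to above.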

On Integer Programming for the Binarized Neural Network Verification Problem

Binarized neural networks (BNNs) are feedforward neural networks with binary weights and activation functions. In the context of using a BNN for classification, the verification problem seeks to determine whether a small perturbation of a given input can cause the input to be misclassified by the BNN, and the robustness of the BNN can be measured … Read more
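A minimal sketch of what one BNN layer computes (illustrative only; the layer structure, signs, and thresholds here are assumptions, not the paper's formulation):

```python
# One binarized layer: weights in {-1, +1}, sign activation.
def sign(v: float) -> int:
    return 1 if v >= 0 else -1

def bnn_layer(x, W, b):
    """Map x -> sign(Wx + b), with every entry of W in {-1, +1}."""
    return [sign(sum(w * xi for w, xi in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

# Verification asks whether a small perturbation of x can flip the
# class produced by stacking such layers.
x = [1, -1, 1]
W = [[1, -1, 1], [-1, -1, 1]]
b = [0, 1]
print(bnn_layer(x, W, b))  # [1, 1]
```

Because every quantity is discrete, the layer maps naturally onto integer-programming constraints, which is what the verification formulation exploits.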

Progressively Sampled Equality-Constrained Optimization

An algorithm is proposed, analyzed, and tested for solving continuous nonlinear-equality-constrained optimization problems where the constraints are defined by an expectation or an average over a large (finite) number of terms. The main idea of the algorithm is to solve a sequence of equality-constrained problems, each involving a finite sample of constraint-function terms, over which … Read more
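A toy illustration of the progressive-sampling idea (this is not the paper's algorithm; the constraint family and the exact subproblem solves are assumptions made for the sketch): when the full constraint is the average of \(N\) terms \(c_i(x) = x - a_i = 0\), its solution is the mean of the \(a_i\), and solving subproblems on growing samples drives the iterate toward it.

```python
import random

# Full constraint: (1/N) * sum_i (x - a_i) = 0, exact solution mean(a).
random.seed(0)
a = [random.gauss(5.0, 1.0) for _ in range(10_000)]
full_solution = sum(a) / len(a)

# Solve a sequence of subproblems, each on a larger sample of terms.
for n in (10, 100, 1_000, 10_000):
    sample = a[:n]
    x_n = sum(sample) / n          # sampled subproblem solved exactly
    print(n, abs(x_n - full_solution))
```

The error shrinks as the sample grows, mirroring the sequence of equality-constrained subproblems described above.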

A Riemannian AdaGrad-Norm Method

We propose a manifold AdaGrad-Norm method (\textsc{MAdaGrad}), which extends the norm version of AdaGrad (AdaGrad-Norm) to Riemannian optimization. In contrast to line-search schemes, which may require several exponential map computations per iteration, \textsc{MAdaGrad} requires only one. Assuming the objective function $f$ has Lipschitz continuous Riemannian gradient, we show that the method requires at most $\mathcal{O}(\varepsilon^{-2})$ … Read more
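For intuition, here is the Euclidean version of the AdaGrad-Norm update that \textsc{MAdaGrad} extends (a sketch under our own notation: \textsc{MAdaGrad} replaces the subtraction step with an exponential-map step on the manifold, and the paper's parameter choices may differ):

```python
import math

# AdaGrad-Norm in R^n: one scalar step size eta / b_k, where
# b_k^2 accumulates squared gradient norms across iterations.
def adagrad_norm(grad, x0, eta=1.0, b0=1e-8, iters=100):
    x, b_sq = list(x0), b0 ** 2
    for _ in range(iters):
        g = grad(x)
        b_sq += sum(gi * gi for gi in g)   # b_k^2 = b_{k-1}^2 + ||g_k||^2
        step = eta / math.sqrt(b_sq)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Minimize f(x) = ||x||^2 / 2, whose gradient is x itself.
x = adagrad_norm(lambda x: x, [1.0, -2.0], iters=500)
print(x)
```

Note that each iteration needs only one gradient (and, on a manifold, one exponential map), which is the advantage over line-search schemes cited above.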

On the Convergence and Properties of a Proximal-Gradient Method on Hadamard Manifolds

In this paper, we address composite optimization problems on Hadamard manifolds, where the objective function is given by the sum of a smooth term (not necessarily convex) and a convex term (not necessarily differentiable). To solve this problem, we develop a proximal gradient method defined directly on the manifold, employing a strategy that enforces monotonicity … Read more
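One common form of the Riemannian proximal gradient step for \(F = f + g\) on a Hadamard manifold \(M\) (illustrative; the paper's scheme and monotonicity safeguard may differ) is

\[
x_{k+1} \;=\; \operatorname*{argmin}_{y \in M} \left\{ g(y) + \bigl\langle \operatorname{grad} f(x_k),\, \exp_{x_k}^{-1}(y) \bigr\rangle + \frac{1}{2t_k}\, d(x_k, y)^2 \right\},
\]

where \(\exp_{x_k}^{-1}\) is the inverse exponential map and \(d\) the geodesic distance; when \(M = \mathbb{R}^n\) this reduces to the classical Euclidean proximal gradient step.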