The if-then Polytope: Conditional Relations over Multiple Sets of Binary Variables

Inspired by its occurrence as a substructure in a stochastic railway timetabling model, we study in this work a special case of the bipartite Boolean quadric polytope. It models conditional relations across three sets of binary variables, where selections within two “if” sets imply a choice in a corresponding “then” set. We call this polytope …
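
As a rough illustration only (the index sets \(K_{ij}\) below are introduced here for exposition and need not match the paper’s definition), a conditional relation of this kind, with “if” variables \(x_i\), \(y_j\) and “then” variables \(z_k\), can be linearized as

\[
\sum_{k \in K_{ij}} z_k \;\ge\; x_i + y_j - 1 \qquad \text{for all pairs } (i, j),
\]

so that choosing \(x_i = 1\) and \(y_j = 1\) forces at least one choice among the associated “then” variables; the products \(x_i y_j\) implicit in such implications are what link the structure to the bipartite Boolean quadric polytope.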

A Parametric Approach for Solving Convex Quadratic Optimization with Indicators Over Trees

This paper investigates convex quadratic optimization problems involving $n$ indicator variables, each associated with a continuous variable, particularly focusing on scenarios where the matrix $Q$ defining the quadratic term is positive definite and its sparsity pattern corresponds to the adjacency matrix of a tree graph. We introduce a graph-based dynamic programming algorithm that solves this …
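
A common way to write this problem class (the exact objective and regularization terms in the paper may differ) is

\[
\min_{x \in \mathbb{R}^n,\; z \in \{0,1\}^n} \; \tfrac{1}{2}\, x^\top Q x + c^\top x + \lambda \sum_{i=1}^{n} z_i \quad \text{s.t.} \quad x_i (1 - z_i) = 0, \quad i = 1, \dots, n,
\]

where each indicator $z_i$ switches its continuous variable $x_i$ on or off, and the tree assumption means $Q_{ij} \neq 0$ only when $i = j$ or $\{i, j\}$ is an edge of the underlying tree.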

A Proximal-Gradient Method for Constrained Optimization

We present a new algorithm for solving optimization problems whose objective function is the sum of a smooth function and a (potentially) nonsmooth regularization function, subject to nonlinear equality constraints. The algorithm may be viewed as an extension of the well-known proximal-gradient method, which is applicable when constraints are not present. To account for nonlinear …
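
In this setting the problem has the form \(\min_x\, f(x) + r(x)\) subject to \(c(x) = 0\), with \(f\) smooth, \(r\) possibly nonsmooth, and \(c\) a nonlinear equality constraint map. For reference, the classical unconstrained proximal-gradient iteration that the abstract refers to (not the paper’s constrained extension) is

\[
x_{k+1} = \operatorname{prox}_{\alpha_k r}\!\bigl(x_k - \alpha_k \nabla f(x_k)\bigr), \qquad \operatorname{prox}_{\alpha r}(v) = \operatorname*{arg\,min}_{x} \; r(x) + \tfrac{1}{2\alpha}\,\|x - v\|_2^2.
\]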

Strengthening Lasserre’s Hierarchy in Real and Complex Polynomial Optimization

We establish a connection between multiplication operators and shift operators. Moreover, we derive positive semidefinite conditions on finite-rank moment sequences and use these conditions to strengthen Lasserre’s hierarchy for real and complex polynomial optimization. Integration of the strengthening technique with sparsity is considered. Extensive numerical experiments show that our strengthening technique can significantly improve …
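
For context, the baseline (unstrengthened) hierarchy for a real problem \(\min_x p(x)\) subject to \(g_j(x) \ge 0\), \(j = 1, \dots, m\), at relaxation order \(d\) reads

\[
\min_{y} \; L_y(p) \quad \text{s.t.} \quad M_d(y) \succeq 0, \quad M_{d - \lceil \deg g_j / 2 \rceil}(g_j\, y) \succeq 0, \quad y_0 = 1,
\]

where \(L_y\) is the Riesz linear functional and \(M_d(y)\), \(M_{d'}(g_j\, y)\) denote the moment and localizing matrices. The paper’s contribution is additional positive semidefinite conditions, derived from finite-rank moment sequences, that are appended to these constraints; the exact form of the added conditions is not reproduced here.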

Revisiting the fitting of the Nelson-Siegel and Svensson models

The Nelson-Siegel and the Svensson models are two of the most widely used models for the term structure of interest rates. Even though the models are quite simple and intuitive, fitting them to market data is numerically challenging and various difficulties have been reported. In this paper, a novel mathematical analysis of the fitting problem …
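
For reference, in one common parameterization the Nelson-Siegel zero rate at maturity \(\tau\) is

\[
y(\tau) \;=\; \beta_0 \;+\; \beta_1\, \frac{1 - e^{-\tau/\lambda}}{\tau/\lambda} \;+\; \beta_2 \left( \frac{1 - e^{-\tau/\lambda}}{\tau/\lambda} - e^{-\tau/\lambda} \right),
\]

and the Svensson model adds a second curvature term with its own decay parameter. The model is linear in \(\beta_0, \beta_1, \beta_2\) but nonlinear in the decay parameter \(\lambda\), which is one source of the reported fitting difficulties.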

Slow convergence of the moment-SOS hierarchy for an elementary polynomial optimization problem

We describe a parametric univariate quadratic optimization problem for which the moment-SOS hierarchy has finite but increasingly slow convergence when the parameter tends to its limit value. We estimate the order of finite convergence as a function of the parameter.
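
To fix ideas (the paper’s specific parametric family is not reproduced here), for a univariate problem \(\min_x p(x)\) subject to \(g(x) \ge 0\) the order-\(d\) SOS bound in the moment-SOS hierarchy is

\[
\rho_d \;=\; \sup \bigl\{ \lambda \;:\; p - \lambda = \sigma_0 + \sigma_1 g, \;\; \sigma_0, \sigma_1 \text{ sums of squares}, \;\; \deg \sigma_0 \le 2d, \; \deg(\sigma_1 g) \le 2d \bigr\},
\]

and finite convergence at order \(d\) means \(\rho_d\) equals the true minimum; the paper estimates how the smallest such \(d\) grows as the problem’s parameter approaches its limit value.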

Stochastic Aspects of Dynamical Low-Rank Approximation in the Context of Machine Learning

The central challenges of today’s neural network architectures are the prohibitive memory footprint and the training costs associated with determining optimal weights and biases. A large portion of research in machine learning is therefore dedicated to constructing memory-efficient training methods. One promising approach is dynamical low-rank training (DLRT), which represents and trains parameters as a …
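
As a minimal sketch of the representation that DLRT builds on (this is not the paper’s training scheme, and all function names below are illustrative), a weight matrix can be stored and applied in factored low-rank form \(W \approx U S V^\top\):

```python
# Minimal sketch (not the paper's DLRT integrators): a weight matrix stored and
# applied in factored low-rank form W ~ U S V^T, the representation that
# dynamical low-rank training builds on. All names here are illustrative.
import numpy as np

def init_low_rank(m, n, r, rng):
    """Random rank-r factorization of an m x n weight matrix."""
    U, _ = np.linalg.qr(rng.standard_normal((m, r)))   # orthonormal basis, m x r
    V, _ = np.linalg.qr(rng.standard_normal((n, r)))   # orthonormal basis, n x r
    S = np.diag(rng.standard_normal(r) * 0.1)          # small r x r core
    return U, S, V

def low_rank_forward(x, U, S, V):
    """Apply the factored layer: x (batch x n) -> x V S^T U^T, never forming W."""
    return x @ V @ S.T @ U.T

rng = np.random.default_rng(0)
U, S, V = init_low_rank(m=64, n=128, r=8, rng=rng)
x = rng.standard_normal((16, 128))
y = low_rank_forward(x, U, S, V)   # shape (16, 64)
print(y.shape)                     # memory: O((m + n) r + r^2) instead of O(m n)
```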

Model Construction for Convex-Constrained Derivative-Free Optimization

We develop a new approximation theory for linear and quadratic interpolation models, suitable for use in convex-constrained derivative-free optimization (DFO). Most existing model-based DFO methods for constrained problems assume the ability to construct sufficiently accurate approximations via interpolation, but the standard notions of accuracy (designed for unconstrained problems) may not be achievable by only sampling …
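
As a point of reference (not the paper’s constrained-aware construction; names below are illustrative), the standard linear interpolation model used in model-based DFO is determined by solving a small linear system over the sample points:

```python
# Minimal sketch: the standard linear interpolation model m(y) = c + g^T (y - x)
# built from n+1 sample points, which the abstract notes may not be sufficiently
# accurate when samples are restricted to a convex feasible set.
import numpy as np

def linear_interpolation_model(x, Y, f_vals):
    """Fit m(y) = c + g @ (y - x) to interpolate f at the rows of Y (shape (n+1, n))."""
    A = np.hstack([np.ones((Y.shape[0], 1)), Y - x])   # rows: [1, (y_i - x)^T]
    coef = np.linalg.solve(A, f_vals)                  # needs n+1 affinely independent points
    c, g = coef[0], coef[1:]
    return c, g

f = lambda y: (y ** 2).sum()                            # toy objective
x = np.zeros(2)
Y = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1]])      # n + 1 = 3 sample points
c, g = linear_interpolation_model(x, Y, np.array([f(y) for y in Y]))
print(c, g)   # g is the model gradient; its accuracy depends on the sample geometry
```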

A Unified Approach for Maximizing Continuous $\gamma$-weakly DR-submodular Functions

This paper presents a unified approach for maximizing continuous \(\gamma\)-weakly DR-submodular functions that encompasses a range of settings and oracle access types. Our approach includes a Frank-Wolfe type offline algorithm for both monotone and non-monotone functions, with different restrictions on the convex feasible region. We consider settings where the oracle provides access to either …
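
A minimal sketch of the kind of Frank-Wolfe step involved, assuming exact gradient access and a polytope feasible region \(\{x : Ax \le b,\ x \ge 0\}\) (this is a generic continuous-greedy iteration, not the paper’s algorithm or its guarantees):

```python
# Minimal sketch of a Frank-Wolfe style update over a polytope {x : A x <= b, x >= 0}.
# This is a generic continuous-greedy iteration, not the paper's algorithm;
# gradients are assumed to be available exactly.
import numpy as np
from scipy.optimize import linprog

def frank_wolfe_max(grad_F, A, b, n, num_steps=100):
    """Maximize F by moving along linear maximizers of <grad F(x), v> over the polytope."""
    x = np.zeros(n)
    for _ in range(num_steps):
        g = grad_F(x)
        # linprog minimizes, so negate the objective to maximize <g, v>
        res = linprog(-g, A_ub=A, b_ub=b, bounds=[(0, None)] * n, method="highs")
        v = res.x
        x = x + v / num_steps          # fixed step 1/T, keeps x a convex combination
    return x

# Toy separable concave objective (diagonal, nonpositive Hessian => DR-submodular)
grad_F = lambda x: 1.0 / (1.0 + x)
A = np.ones((1, 3)); b = np.array([1.0])   # budget constraint sum(x) <= 1
print(frank_wolfe_max(grad_F, A, b, n=3))
```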

Novel stepsize for some accelerated and stochastic optimization methods

First-order methods are commonly used to solve optimization problems, and they must be continually improved to keep up with the constant developments in machine learning and mathematics. Among them, the branch of algorithms based on gradient descent has developed rapidly and achieved good results. In line with that trend, in this article we research …
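
For illustration only, a generic gradient descent loop with a classical backtracking (Armijo) stepsize, which is not the novel stepsize proposed in the paper:

```python
# Minimal sketch of gradient descent with a backtracking (Armijo) stepsize,
# shown only to illustrate what a stepsize rule controls; it is NOT the
# stepsize proposed in the paper.
import numpy as np

def gradient_descent(f, grad_f, x0, iters=50, alpha0=1.0, beta=0.5, c=1e-4):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad_f(x)
        alpha = alpha0
        # shrink alpha until the Armijo sufficient-decrease condition holds
        while f(x - alpha * g) > f(x) - c * alpha * (g @ g):
            alpha *= beta
        x = x - alpha * g
    return x

f = lambda x: 0.5 * x @ x + np.sin(x).sum()
grad_f = lambda x: x + np.cos(x)
print(gradient_descent(f, grad_f, x0=np.array([2.0, -3.0])))
```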