Optimality Conditions and Constraint Qualifications for Generalized Nash Equilibrium Problems and their Practical Implications

Generalized Nash Equilibrium Problems (GNEPs) are a generalization of the classic Nash Equilibrium Problems (NEPs), where each player’s strategy set depends on the choices of the other players. In this work we study constraint qualifications and optimality conditions tailored for GNEPs and we discuss their relations and implications for global convergence of algorithms. Surprisingly, differently …
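To fix notation (a standard GNEP formulation, stated here only for orientation and not necessarily the exact setting of the paper): with N players, player \nu controls the block x^\nu and solves

\[
\min_{x^\nu}\ \theta_\nu(x^\nu, x^{-\nu}) \quad \text{s.t.} \quad g^\nu(x^\nu, x^{-\nu}) \le 0,
\]

where x^{-\nu} collects the rivals' strategies. A point \bar{x} is a generalized Nash equilibrium if every block \bar{x}^\nu solves the corresponding problem with \bar{x}^{-\nu} fixed; in an ordinary NEP only the objectives, not the constraint sets, depend on x^{-\nu}.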

Mixed-Integer PDE-Constrained Optimal Control of Gas Networks

We develop a mixed-integer optimal control model with partial differential equation (PDE) constraints for gas transport networks, designed for controlling extreme state transitions, such as flow reversals. Our model shows how to combine binary compressor controls with PDE flow models. We model the flow of gas using a variant of the Euler equations, which we …
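For orientation, one commonly used variant of the Euler equations for gas flow in a single pipe (the precise model of the paper may differ) is the isothermal system

\[
\partial_t \rho + \partial_x(\rho v) = 0, \qquad
\partial_t(\rho v) + \partial_x\bigl(p + \rho v^2\bigr) = -\frac{\lambda}{2D}\,\rho v\,|v| - g\,\rho\,\sin\alpha, \qquad
p = c^2 \rho,
\]

with density \rho, velocity v, pressure p, friction factor \lambda, pipe diameter D, inclination \alpha and speed of sound c. The binary compressor decisions then enter through the coupling and control conditions imposed at the network nodes.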

Model and Discretization Error Adaptivity within Stationary Gas Transport Optimization

The minimization of operation costs for natural gas transport networks is studied. Based on a recently developed model hierarchy ranging from detailed models of instationary partial differential equations with temperature dependence to highly simplified algebraic equations, modeling and discretization error estimates are presented to control the overall error in an optimization method for stationary and …
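At the coarsest level of such hierarchies one typically finds a purely algebraic pressure-loss law; a representative example (given only for illustration, the paper's hierarchy may be organized differently) is the stationary Weymouth-type relation

\[
p_{\text{in}}^2 - p_{\text{out}}^2 = c_{\text{pipe}}\, q\,|q|,
\]

which links the squared end pressures of a pipe to the mass flow q, while the finest level keeps the full instationary, temperature-dependent PDE. The error estimates decide which model level, and which discretization, is accurate enough on each pipe during the optimization.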

Multipoint secant and interpolation methods with nonmonotone line search for solving systems of nonlinear equations

Multipoint secant and interpolation methods are effective tools for solving systems of nonlinear equations. They use quasi-Newton updates for approximating the Jacobian matrix. Owing to their ability to make fuller use of the Jacobian information gathered at previous iterations, these methods are especially efficient when function evaluations are expensive. They are …
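As a minimal illustration of the ingredients mentioned above, here is a single-secant Broyden iteration combined with a simple nonmonotone acceptance test (the multipoint secant and interpolation methods of the paper are more elaborate; this sketch is not their implementation):

import numpy as np

def broyden_nonmonotone(F, x0, tol=1e-8, max_iter=100, memory=5, gamma=1e-4):
    # Quasi-Newton (Broyden) iteration: B approximates the Jacobian of F and is
    # corrected by a rank-one secant update instead of being recomputed.
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    B = np.eye(x.size)
    merits = [np.linalg.norm(Fx)]            # recent values of ||F(x)||
    for _ in range(max_iter):
        if merits[-1] <= tol:
            break
        d = np.linalg.solve(B, -Fx)          # quasi-Newton direction
        ref = max(merits[-memory:])          # nonmonotone reference value
        alpha = 1.0
        while True:                          # backtracking with nonmonotone test
            x_new = x + alpha * d
            F_new = F(x_new)
            if np.linalg.norm(F_new) <= (1.0 - gamma * alpha) * ref or alpha < 1e-10:
                break
            alpha *= 0.5
        s, y = x_new - x, F_new - Fx
        B += np.outer(y - B @ s, s) / (s @ s)    # Broyden rank-one update
        x, Fx = x_new, F_new
        merits.append(np.linalg.norm(Fx))
    return x

# Example: intersect the unit circle with the line x0 = x1.
root = broyden_nonmonotone(
    lambda z: np.array([z[0]**2 + z[1]**2 - 1.0, z[0] - z[1]]),
    np.array([2.0, 0.5]))

No Jacobian evaluations are needed at all, which is the appeal of secant-type methods when each evaluation of F is expensive.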

On the local stability of semidefinite relaxations

In this paper we consider a parametric family of polynomial optimization problems over algebraic sets. Although these problems are typically nonconvex, tractable convex relaxations via semidefinite programming (SDP) have been proposed. Often, in applications, there is a natural value of the parameters for which the relaxation solves the problem exactly. We study conditions …
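A concrete instance of such a relaxation is the Shor relaxation of a quadratically constrained problem (the paper treats more general polynomial problems over algebraic sets; this is only meant to fix ideas), which replaces the rank-one matrix xx^\top by a positive semidefinite variable:

\[
\min_{x}\ x^\top C x \ \ \text{s.t.}\ x^\top A_i x = b_i
\qquad\longrightarrow\qquad
\min_{X \succeq 0}\ \langle C, X\rangle \ \ \text{s.t.}\ \langle A_i, X\rangle = b_i .
\]

The relaxation is exact whenever the optimal X has rank one, since then X = xx^\top recovers a minimizer of the original problem; the local stability question is how robust this exactness is as the problem data move away from the nominal parameter value.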

MILP feasibility by nonlinear programming

We discuss a tightly feasible mixed-integer linear program arising in the energy industry, for which branch-and-bound appears to be ineffective. We consider its hardness, measure the probability that randomly generated instances are feasible or almost feasible, and introduce heuristic solution methods based on relaxing different constraints of the problem. We show the computational efficiency of …
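One standard device for attacking MILP feasibility with nonlinear programming tools, given purely as an illustration (the heuristics of the paper are based on relaxing different constraints of the specific instance), is to trade integrality for a smooth nonconvex objective:

\[
\{x : Ax \le b,\ x_j \in \mathbb{Z}\ (j \in J)\}
\qquad\rightsquigarrow\qquad
\min_x\ \sum_{j \in J} \sin^2(\pi x_j) \ \ \text{s.t.}\ Ax \le b .
\]

The MILP is feasible exactly when this NLP attains the objective value zero, so local NLP solvers started from several points can serve as feasibility heuristics.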

On global minimizers of quadratic functions with cubic regularization

In this paper, we analyze some theoretical properties of the problem of minimizing a quadratic function with a cubic regularization term, which arises in many methods for unconstrained and constrained optimization proposed in recent years. First we show that, given any stationary point that is not a global solution, it is possible …
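In the notation commonly used for cubic regularization, the model in question is

\[
m(x) = c^\top x + \tfrac12\, x^\top H x + \tfrac{\sigma}{3}\,\|x\|^3, \qquad \sigma > 0,
\]

and its global minimizers admit a classical characterization: a point x^* is a global minimizer if and only if

\[
(H + \sigma\|x^*\|\,I)\,x^* = -c \quad\text{and}\quad H + \sigma\|x^*\|\,I \succeq 0 .
\]

Every stationary point of m satisfies the first condition, so the non-global stationary points discussed in the abstract are precisely those at which the semidefiniteness condition fails.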

Efficient Convex Optimization for Linear MPC

Model predictive control (MPC) formulations with linear dynamics and quadratic objectives can be solved efficiently by using a primal-dual interior-point framework, with complexity proportional to the length of the horizon. An alternative, which better exploits the similarity of the problems solved at successive decision points of linear MPC, is to …
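The problem solved at each decision point is a structured quadratic program of the form (standard MPC notation, used here only to fix ideas)

\[
\min_{u_0,\dots,u_{N-1}}\ \tfrac12\, x_N^\top P x_N + \sum_{k=0}^{N-1} \tfrac12\bigl(x_k^\top Q x_k + u_k^\top R u_k\bigr)
\quad \text{s.t.}\ \ x_{k+1} = A x_k + B u_k,\ \ k = 0,\dots,N-1,\ \ x_0 \text{ given},
\]

possibly with bounds on states and inputs. Its KKT system is block-banded when the variables are ordered stage by stage, which is what lets an interior-point method factorize it in time proportional to the horizon length N.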

A note on using performance and data profiles for training algorithms

It is shown how to use the performance and data profile benchmarking tools to improve algorithms’ performance. An illustration for the BFO derivative-free optimizer suggests that the obtained gains are potentially significant. Citation: ACM Transactions on Mathematical Software 45(2), Article 20, 2019.
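For readers unfamiliar with the tool: a performance profile reports, for each solver, the fraction of problems it solves within a factor tau of the best solver on that problem. A minimal sketch of the computation (an illustration only, not the code used in the paper or in BFO):

import numpy as np

def performance_profile(T, taus):
    # T[p, s] is the cost (e.g. CPU time or evaluations) of solver s on problem p,
    # with np.inf marking a failure.  Returns rho[s, k]: the fraction of problems
    # on which solver s is within a factor taus[k] of the best solver.
    T = np.asarray(T, dtype=float)
    best = T.min(axis=1, keepdims=True)        # best cost on each problem
    ratios = T / best                          # performance ratios r_{p,s}
    rho = np.array([[np.mean(ratios[:, s] <= tau) for s in range(T.shape[1])]
                    for tau in taus]).T
    return rho

# Example: 3 problems, 2 solvers; solver 1 fails on the last problem.
T = [[1.0, 2.0], [3.0, 1.5], [np.inf, 4.0]]
print(performance_profile(T, taus=[1.0, 2.0, 4.0]))

Using such profiles as the merit function when tuning an algorithm's parameters is, roughly, what "training" means here.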

Block Coordinate Descent Almost Surely Converges to a Stationary Point Satisfying the Second-order Necessary Condition

Given a non-convex twice continuously differentiable cost function with Lipschitz continuous gradient, we prove that the block coordinate gradient descent, block mirror descent, and proximal block coordinate descent methods all converge, almost surely under random initialization, to stationary points satisfying the second-order necessary condition. All our results rest on the center-stable manifold theorem …
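A schematic of the simplest of these methods, block coordinate gradient descent with random initialization (an illustrative sketch, not the authors' setting in full generality):

import numpy as np

def block_coordinate_descent(grad, blocks, x0, step=0.05, n_iter=500):
    # Cyclic block coordinate gradient descent: in each sweep, every block of
    # variables takes one gradient step while the remaining blocks stay fixed.
    x = np.array(x0, dtype=float)
    for _ in range(n_iter):
        for idx in blocks:
            g = grad(x)                  # full gradient at the current point
            x[idx] -= step * g[idx]      # update only the current block
    return x

# Example: f(x) = (x1^2 - 1)^2 + x2^2 has a strict saddle at the origin and
# minimizers at (+1, 0) and (-1, 0); a random start avoids the saddle almost surely.
rng = np.random.default_rng(0)
grad = lambda z: np.array([4.0 * z[0] * (z[0]**2 - 1.0), 2.0 * z[1]])
x_star = block_coordinate_descent(grad, blocks=[np.array([0]), np.array([1])],
                                  x0=rng.normal(size=2))
print(x_star)    # close to (+1, 0) or (-1, 0), not the strict saddle at the origin

The second-order necessary condition rules out exactly such strict saddles, which is where the center-stable manifold argument enters.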