Global Stability Analysis of Fluid Flows using Sum-of-Squares

This paper introduces a new method for proving global stability of fluid flows through the construction of Lyapunov functionals. For finite-dimensional approximations of fluid systems, we show how one can exploit recently developed optimization methods based on sum-of-squares decomposition to construct a polynomial Lyapunov function. We then show how these methods can be extended … Read more
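
As a purely illustrative sketch (not the paper's method), the degree-2 special case of such a sum-of-squares Lyapunov search for linear dynamics dx/dt = A x reduces to a pair of linear matrix inequalities; the cvxpy modelling language and the toy matrix A below are assumptions of this example.

```python
# Degree-2 special case of an SOS Lyapunov search (illustration only): for linear
# dynamics dx/dt = A x, a quadratic Lyapunov function V(x) = x^T P x is certified
# by the LMIs  P >> 0  and  A^T P + P A << 0, solved here with cvxpy.
import numpy as np
import cvxpy as cp

A = np.array([[-1.0, 2.0],
              [ 0.0, -3.0]])                      # toy stable system (assumed)

P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(2),
               A.T @ P + P @ A << -eps * np.eye(2)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()

print("status:", prob.status)                     # 'optimal' => certificate found
print("P =\n", P.value)
```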

Preconditioning and Globalizing Conjugate Gradients in Dual Space for Quadratically Penalized Nonlinear Least-Squares Problems

When solving nonlinear least-squares problems, it is often useful to regularize the problem using a quadratic term, a practice which is especially common in applications arising in inverse calculations. A solution method derived from a trust-region Gauss-Newton algorithm is analyzed for such applications, where, contrary to the standard algorithm, the least-squares subproblem solved at each … Read more
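
For orientation only (and in primal rather than the paper's dual space): a single Gauss-Newton step for a quadratically penalized nonlinear least-squares problem, with the regularized normal equations solved inexactly by conjugate gradients; the residual F, penalty weight mu, and background x_b below are made-up illustrations.

```python
# Illustration only: one Gauss-Newton step for
#   min_x 0.5*||F(x)||^2 + 0.5*mu*||x - x_b||^2,
# with the regularized normal equations solved inexactly by conjugate gradients.
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def F(x):                                # toy nonlinear residual (assumed)
    return np.array([x[0]**2 - 1.0, x[0] * x[1] - 2.0])

def J(x):                                # its Jacobian
    return np.array([[2.0 * x[0], 0.0],
                     [x[1],       x[0]]])

mu, x_b = 0.1, np.zeros(2)               # penalty weight and background (assumed)
x = np.array([2.0, 2.0])

Jk, Fk = J(x), F(x)
grad = Jk.T @ Fk + mu * (x - x_b)
GN = LinearOperator((2, 2), matvec=lambda v: Jk.T @ (Jk @ v) + mu * v)

dx, info = cg(GN, -grad)                 # CG on the SPD Gauss-Newton model
print("CG flag:", info, " new iterate:", x + dx)
```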

Quest for the control on the second-order derivatives: topology optimization with a functional that includes the state’s curvature

Many physical phenomena governed by partial differential equations (PDEs) are second order in nature. It therefore makes sense to pose the control on the second-order derivatives of the field solution, in addition to the zeroth- and first-order ones, so as to consistently control the underlying process. However, this type of control is nontrivial and, to the best … Read more
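
As a minimal illustration of controlling second-order information (not the paper's topology-optimization formulation): a 1-D field with a discrete curvature (second-difference) penalty added to a zeroth-order misfit; u_target and beta are arbitrary choices made for this sketch.

```python
# Illustrative only: a zeroth-order misfit plus a curvature (second-difference)
# penalty on a 1-D field u,
#   min_u 0.5*||u - u_target||^2 + 0.5*beta*||D2 u||^2,
# which is quadratic, so the minimizer solves a linear system.
import numpy as np

n = 50
u_target = np.sin(np.linspace(0.0, np.pi, n))   # assumed target field
beta = 10.0                                     # curvature weight (assumed)

D2 = np.zeros((n - 2, n))                       # interior second-difference stencil
for i in range(n - 2):
    D2[i, i:i + 3] = [1.0, -2.0, 1.0]

A = np.eye(n) + beta * D2.T @ D2
u = np.linalg.solve(A, u_target)
print("curvature norm of the controlled field:", np.linalg.norm(D2 @ u))
```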

Newton–Picard-Based Preconditioning for Linear-Quadratic Optimization Problems with Time-Periodic Parabolic PDE Constraints

We develop and investigate two preconditioners for a basic linear iterative splitting method for the numerical solution of linear-quadratic optimization problems with time-periodic parabolic PDE constraints. The resulting real-valued linear system to be solved is symmetric indefinite. We propose all-at-once symmetric indefinite preconditioners based on a Newton–Picard approach which divides the variable space into slow … Read more
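
For context only, a generic sketch of the setting rather than the Newton–Picard preconditioner itself: a small symmetric indefinite saddle-point system solved with MINRES under a block-diagonal preconditioner; the blocks H and B are toy data invented for this example.

```python
# Generic setting only, not the Newton-Picard construction: a small symmetric
# indefinite saddle-point (KKT) system K z = b solved by MINRES with a
# block-diagonal preconditioner diag(H, S), S the Schur complement.
import numpy as np
from scipy.sparse.linalg import minres, LinearOperator

H = np.diag([4.0, 3.0, 2.0])               # toy Hessian block
B = np.array([[1.0, 0.0, 1.0]])            # toy constraint block
K = np.block([[H, B.T],
              [B, np.zeros((1, 1))]])      # symmetric indefinite
b = np.array([1.0, 2.0, 3.0, 0.5])

S = B @ np.linalg.inv(H) @ B.T             # Schur complement (formed exactly here)

def apply_prec(r):
    # Applies diag(H, S)^{-1}; in practice this would only be approximated.
    return np.concatenate([np.linalg.solve(H, r[:3]),
                           np.linalg.solve(S, r[3:])])

M = LinearOperator(K.shape, matvec=apply_prec)
z, info = minres(K, b, M=M)
print("MINRES flag:", info, " solution:", z)
```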

Using approximate secant equations in limited memory methods for multilevel unconstrained optimization

The properties of multilevel optimization problems defined on a hierarchy of discretization grids can be used to define approximate secant equations, which describe the second-order behaviour of the objective function. Following earlier work by Gratton and Toint (2009), we introduce a quasi-Newton method (with a linesearch) and a nonlinear conjugate gradient method that both take … Read more
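
For reference (standard background, not taken from the abstract), the secant equation on which limited-memory quasi-Newton updates are built; the multilevel variants described above replace the pair (s_k, y_k) by information gathered from coarser grids.

```latex
% Standard secant equation behind limited-memory quasi-Newton updates; the
% multilevel variants replace (s_k, y_k) by pairs built from coarser grids.
B_{k+1} s_k = y_k, \qquad
s_k = x_{k+1} - x_k, \qquad
y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
```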

Asymptotic expansion for the solution of a penalized control-constrained semilinear elliptic problem

In this work we consider the optimal control problem of a semilinear elliptic PDE with a Dirichlet boundary condition, where the control variable is distributed over the domain and is constrained to be nonnegative. The approach is to consider an associated parametrized family of penalized problems, whose solutions define a central path converging to the … Read more
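
As one hypothetical instance of such a parametrized penalized family (the paper's precise penalty may differ), a logarithmic penalization of the nonnegativity constraint whose minimizers trace a central path as the parameter decreases to zero; here J denotes the cost, y_u the state associated with the control u, and Ω the domain, notation introduced only for this illustration.

```latex
% Hypothetical log-penalized family; its minimizers u_eps trace a central path
% as eps decreases to zero (the paper's precise penalty may differ).
\min_{u}\; J(y_u, u) \;-\; \varepsilon \int_{\Omega} \log u(x)\,dx,
\qquad \varepsilon \downarrow 0.
```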

Band Gap Optimization of Two-Dimensional Photonic Crystals Using Semidefinite Programming and Subspace Methods

In this paper, we consider the optimal design of photonic crystal band structures for two-dimensional square lattices. The mathematical formulation of the band gap optimization problem leads to an infinite-dimensional Hermitian eigenvalue optimization problem parametrized by the dielectric material and the wave vector. To make the problem tractable, the original eigenvalue problem is discretized using … Read more
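
As a building block only (not the paper's band-gap objective), the standard semidefinite-programming reformulation of maximizing the smallest eigenvalue of an affine symmetric family; cvxpy and the 2x2 toy matrices are assumptions of this sketch.

```python
# Building block only, not the paper's band-gap formulation: maximizing the
# smallest eigenvalue of an affine symmetric family A(x) = A0 + x1*A1 via the SDP
#   max t  subject to  A(x) - t*I >= 0,  0 <= x1 <= 1.
import numpy as np
import cvxpy as cp

A0 = np.array([[2.0, 0.5],
               [0.5, 1.0]])
A1 = np.array([[0.0, -0.5],
               [-0.5, 1.0]])                     # toy "material" perturbation

x1, t = cp.Variable(), cp.Variable()
constraints = [A0 + x1 * A1 - t * np.eye(2) >> 0, x1 >= 0, x1 <= 1]
prob = cp.Problem(cp.Maximize(t), constraints)
prob.solve()
print("largest attainable smallest eigenvalue:", t.value, " at x1 =", x1.value)
```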

Control problems with mixed constraints and application to an optimal investment problem

We discuss two optimal control problems of parabolic equations, with mixed state and control constraints, for which the standard qualification condition does not hold. Our first example is a bottleneck problem, and the second one is an optimal investment problem where a utility-type function is to be minimized. By an adapted penalization technique, we … Read more
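
For illustration only, a generic quadratic penalization of a mixed state-control constraint g(y, u) ≤ 0 (the adapted penalization used in the paper may differ); here J is the cost, y_u the state associated with the control u, and Q the space-time cylinder, notation introduced for this sketch.

```latex
% Generic quadratic penalization of a mixed state-control constraint g(y,u) <= 0
% (the adapted penalization used in the paper may differ):
\min_{u}\; J(y_u, u) \;+\; \frac{1}{2\varepsilon}
\int_{Q} \big(\max\{0,\; g(y_u(x,t), u(x,t))\}\big)^{2}\,dx\,dt,
\qquad \varepsilon \downarrow 0.
```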

An Interior-Point Algorithm for Large-Scale Nonlinear Optimization with Inexact Step Computations

We present a line-search algorithm for large-scale continuous optimization. The algorithm is matrix-free in that it does not require the factorization of derivative matrices. Instead, it uses iterative linear system solvers. Inexact step computations are supported in order to save computational expense during each iteration. The algorithm is an interior-point approach derived from an inexact … Read more
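
A hedged sketch of the matrix-free ingredient only (not the paper's interior-point algorithm): the step is computed from matrix-vector products with the KKT operator through a truncated Krylov solve, so it is deliberately inexact; the diagonal stand-in for the KKT operator is an assumption of this example.

```python
# Sketch of the matrix-free idea only: the step is obtained from matrix-vector
# products with the KKT operator via a truncated Krylov solve, hence inexact.
import numpy as np
from scipy.sparse.linalg import gmres, LinearOperator

n = 200
rng = np.random.default_rng(0)
d = rng.uniform(1.0, 10.0, n)              # stand-in for a KKT-system spectrum

def kkt_matvec(v):
    # Only products with the (here diagonal, assumed) KKT matrix are needed;
    # no factorization of derivative matrices is ever formed.
    return d * v

A = LinearOperator((n, n), matvec=kkt_matvec)
rhs = rng.standard_normal(n)

step, info = gmres(A, rhs, restart=10, maxiter=2)   # truncated => inexact step
print("flag:", info, " residual norm:", np.linalg.norm(kkt_matvec(step) - rhs))
```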

The Advanced Step NMPC Controller: Optimality, Stability and Robustness

Widespread application of dynamic optimization with fast optimization solvers leads to increased consideration of first-principles models for nonlinear model predictive control (NMPC). However, significant barriers to this optimization-based control strategy are feedback delays and consequent loss of performance and stability due to on-line computation. To overcome these barriers, recently proposed NMPC controllers based on nonlinear … Read more
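
To illustrate the advanced-step idea on a toy problem (this is not the controller analyzed in the paper): the optimal control problem is solved ahead of time at a predicted state, and a fast first-order sensitivity correction is applied once the measurement arrives; for the unconstrained linear-quadratic example below, with assumed matrices A, B, Q, R, the correction happens to be exact.

```python
# Toy illustration of the advanced-step idea (not the paper's NMPC controller):
# solve the optimal control problem ahead of time at a predicted state, then apply
# a fast first-order sensitivity correction once the actual state is measured.
import numpy as np

A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
Q, R = np.eye(2), np.array([[0.1]])        # assumed one-step quadratic cost

def solve_ocp(x0):
    # Unconstrained one-step problem: min_u u'Ru + (Ax0 + Bu)'Q(Ax0 + Bu);
    # the optimal input is linear in x0, so K is also the sensitivity du*/dx0.
    K = -np.linalg.solve(R + B.T @ Q @ B, B.T @ Q @ A)
    return K @ x0, K

x_pred = np.array([1.0, 0.0])              # state predicted one sample ahead
u_bg, K = solve_ocp(x_pred)                # expensive solve done "in background"

x_meas = np.array([1.05, -0.02])           # measurement arrives: cheap correction
u_applied = u_bg + K @ (x_meas - x_pred)
print(u_applied, solve_ocp(x_meas)[0])     # here the correction matches a re-solve
```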