How do exponential size solutions arise in semidefinite programming?

Semidefinite programs (SDPs) are some of the most popular and broadly applicable optimization problems to emerge in the last thirty years. A curious pathology of SDPs, illustrated by a classical example of Khachiyan, is that their solutions may need exponential space to even write down. Exponential size solutions are the main obstacle to solving a … Read more
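
For orientation, here is a hedged sketch of the classical Khachiyan-type construction (our illustration, not taken from the truncated abstract): chaining the constraints $x_{i+1} \ge x_i^2$ through small positive semidefinite blocks forces doubly exponential growth,
\[
x_1 \ge 2, \qquad \begin{pmatrix} x_{i+1} & x_i \\ x_i & 1 \end{pmatrix} \succeq 0 \quad (i=1,\dots,n-1) \quad\Longrightarrow\quad x_n \ge 2^{2^{\,n-1}},
\]
since each $2\times 2$ block is positive semidefinite exactly when $x_{i+1} \ge x_i^2$. Any feasible $x_n$ therefore needs on the order of $2^{n-1}$ bits to write down, while the SDP itself has size polynomial in $n$.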

Cutting Plane Generation Through Sparse Principal Component Analysis

Quadratically-constrained quadratic programs (QCQPs) are optimization models whose remarkable expressiveness has made them a cornerstone of methodological research for nonconvex optimization problems. However, modern methods to solve a general QCQP fail to scale, encountering computational challenges even with just a few hundred variables. Specifically, a semidefinite programming (SDP) relaxation is typically employed, which provides strong … Read more
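
For context, the relaxation mentioned is typically of the following Shor type (a standard sketch we supply; the notation is ours): a QCQP $\min_x\{x^\top Q_0 x + c_0^\top x : x^\top Q_i x + c_i^\top x + b_i \le 0\}$ is lifted by replacing $xx^\top$ with a matrix variable $X$,
\[
\min_{x,X}\ \langle Q_0, X\rangle + c_0^\top x \quad \text{s.t.}\quad \langle Q_i, X\rangle + c_i^\top x + b_i \le 0, \qquad \begin{pmatrix} 1 & x^\top \\ x & X \end{pmatrix} \succeq 0.
\]
The relaxation is exact when $X = xx^\top$ (rank one); cutting planes are one way to close the gap when it is not.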

Implicit Regularization of Sub-Gradient Method in Robust Matrix Recovery: Don’t be Afraid of Outliers

It is well-known that simple short-sighted algorithms, such as gradient descent, generalize well in over-parameterized learning tasks, due to their implicit regularization. However, it is unknown whether the implicit regularization of these algorithms can be extended to robust learning tasks, where a subset of samples may be grossly corrupted with noise. In this work, … Read more
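
As a hedged illustration of the kind of method at issue (an assumed setup with an $\ell_1$ loss, a factorized over-parameterized variable, and illustrative step sizes; none of this is taken from the paper itself):

```python
import numpy as np

def subgradient_matrix_recovery(A_ops, y, n, r, steps=500, lr=0.1, seed=0):
    """Sub-gradient method for the robust (l1) matrix recovery loss
        f(U) = sum_i | <A_i, U U^T> - y_i |
    with an over-parameterized factor U of size n x r.  Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    U = 1e-3 * rng.standard_normal((n, r))        # small init: relies on implicit regularization
    for k in range(steps):
        residuals = np.array([np.sum(A * (U @ U.T)) - yi for A, yi in zip(A_ops, y)])
        signs = np.sign(residuals)                # sub-gradient of the absolute value
        G = sum(s * (A + A.T) @ U for s, A in zip(signs, A_ops))
        U -= lr / np.sqrt(k + 1) * G              # diminishing step size
    return U @ U.T
```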

New notions of simultaneous diagonalizability of quadratic forms with applications to QCQPs

A set of quadratic forms is simultaneously diagonalizable via congruence (SDC) if there exists a basis under which each of the quadratic forms is diagonal. This property appears naturally when analyzing quadratically constrained quadratic programs (QCQPs) and has important implications in this context. This paper extends the reach of the SDC property by studying two … Read more
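
In symbols (the standard statement of the basic property, not the paper's extensions of it): symmetric matrices $A_1,\dots,A_m$ are SDC if there exists a nonsingular $P$ with
\[
P^\top A_i P = D_i \ \text{diagonal}, \qquad i = 1,\dots,m,
\]
so that after the change of variables $x = Py$ every quadratic form $x^\top A_i x = y^\top D_i y$ becomes a weighted sum of squares in the same coordinates.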

A Matrix-Free Trust-Region Newton Algorithm for Convex-Constrained Optimization

We describe a matrix-free trust-region algorithm for solving convex-constrained optimization problems that uses the spectral projected gradient method to compute trial steps. To project onto the intersection of the feasible set and the trust region, we reformulate and solve the dual projection problem as a one-dimensional root finding problem. We demonstrate our algorithm’s performance on … Read more
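
A hedged sketch of the one-dimensional dual root-finding idea, specialized to projecting onto a box intersected with a Euclidean trust region (this special case and all names here are ours for illustration; the algorithm itself handles general convex feasible sets):

```python
import numpy as np
from scipy.optimize import brentq

def project_box_ball(y, lo, hi, delta):
    """Project y onto {x : lo <= x <= hi, ||x||_2 <= delta} (assumed nonempty).
    For a multiplier lam >= 0 on the ball constraint, the box-constrained minimizer of
    ||x - y||^2/2 + lam*||x||^2/2 is x(lam) = clip(y/(1+lam), lo, hi), so the projection
    reduces to finding the scalar lam with ||x(lam)||_2 = delta."""
    x = np.clip(y, lo, hi)
    if np.linalg.norm(x) <= delta:                 # ball constraint inactive
        return x
    phi = lambda lam: np.linalg.norm(np.clip(y / (1.0 + lam), lo, hi)) - delta
    hi_lam = 1.0
    while phi(hi_lam) > 0:                         # bracket the root of the nonincreasing phi
        hi_lam *= 2.0
    lam = brentq(phi, 0.0, hi_lam)                 # one-dimensional root find
    return np.clip(y / (1.0 + lam), lo, hi)
```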

ALESQP: An augmented Lagrangian equality-constrained SQP method for optimization with general constraints

We present a new algorithm for infinite-dimensional optimization with general constraints, called ALESQP. In short, ALESQP is an augmented Lagrangian method that penalizes inequality constraints and solves equality-constrained nonlinear optimization subproblems at every iteration. The subproblems are solved using a matrix-free trust-region sequential quadratic programming (SQP) method that takes advantage of iterative, i.e., inexact linear … Read more
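
One generic way to realize the pattern described, written with our own symbols (not necessarily the paper's exact formulation): for $\min f(x)$ subject to $c(x)=0$ and $g(x)\le 0$, introduce slacks $s \ge 0$ and penalize only the inequality block,
\[
\min_{x,\;s\ge 0}\ f(x) + \lambda^\top\big(g(x)+s\big) + \tfrac{\rho}{2}\,\big\|g(x)+s\big\|^2 \quad \text{s.t.}\quad c(x)=0,
\]
so each outer iteration leaves an essentially equality-constrained subproblem, which a matrix-free trust-region SQP method can solve inexactly before $\lambda$ and $\rho$ are updated in the usual augmented Lagrangian fashion.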

Exponential Decay of Sensitivity in Graph-Structured Nonlinear Programs

We study solution sensitivity for nonlinear programs (NLPs) whose structure is induced by a graph $\mathcal{G}=(\mathcal{V},\mathcal{E})$. These graph-structured NLPs arise in many applications such as dynamic optimization, stochastic optimization, optimization with partial differential equations, and network optimization. We show that the sensitivity of the primal-dual solution at node $i\in \mathcal{V}$ to a data perturbation at … Read more
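
Schematically, the headline bound has the form (constants and precise regularity assumptions are omitted; the statement below is our paraphrase):
\[
\big\|\, z_i^*(p + \delta) - z_i^*(p) \,\big\| \;\le\; C\,\rho^{\,d_{\mathcal G}(i,j)}\,\|\delta\|, \qquad 0 < \rho < 1,
\]
where $z_i^*$ collects the primal-dual solution components at node $i$, the perturbation $\delta$ enters only the data at node $j$, and $d_{\mathcal G}(i,j)$ is the graph distance between $i$ and $j$; perturbations far away in the graph barely move the local solution.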

Dynamic string-averaging CQ-methods for the split feasibility problem with percentage violation constraints arising in radiation therapy treatment planning

We study a feasibility-seeking problem with percentage violation constraints. These are additional constraints, appended to an existing family of constraints, that single out certain subsets of the existing constraints and declare that up to a specified fraction of the constraints in each subset is allowed to be violated by up to … Read more
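
In rough symbols (our paraphrase, with illustrative notation): given constraint sets $C_1,\dots,C_m$ and a designated index subset $S \subseteq \{1,\dots,m\}$, a percentage violation constraint requires
\[
\#\big\{\, t \in S \;:\; \operatorname{dist}(x, C_t) > \varepsilon \,\big\} \;\le\; \alpha\,|S|,
\]
i.e., at most a fraction $\alpha$ of the constraints indexed by $S$ may be violated, and each by no more than a tolerance $\varepsilon$.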

An $(s^r)$-Resolution ODE Framework for Understanding Discrete-Time Algorithms and Applications to the Linear Convergence of Minimax Problems

There has been a long history of using ordinary differential equations (ODEs) to understand the dynamics of discrete-time algorithms (DTAs). Surprisingly, two fundamental questions remain unanswered: (i) it is unclear how to obtain a \emph{suitable} ODE from a given DTA, and (ii) it is unclear what the connection is between the convergence of a … Read more
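
A one-line instance of question (i), supplied here only for orientation: gradient descent with step size $s$,
\[
x_{k+1} = x_k - s\,\nabla f(x_k),
\]
is classically matched to the gradient flow $\dot X(t) = -\nabla f\big(X(t)\big)$ as $s \to 0$, yet the same DTA is also consistent with higher-order ODEs whose correction terms depend on $s$, which is precisely why a principled notion of a suitable ODE at a given resolution is needed.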

Optimization with learning-informed differential equation constraints and its applications

Inspired by applications in optimal control of semilinear elliptic partial differential equations and physics-integrated imaging, differential equation constrained optimization problems with constituents that are only accessible through data-driven techniques are studied. A particular focus is on the analysis and on numerical methods for problems with machine-learned components. For a rather general context, an error analysis … Read more
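
A representative model problem of this type (an illustrative instance we supply, not necessarily one treated in the paper): optimal control of a semilinear elliptic equation in which the nonlinearity is replaced by a learned surrogate $\mathcal N_\theta$,
\[
\min_{y,\,u}\ \tfrac12\|y - y_d\|_{L^2(\Omega)}^2 + \tfrac{\alpha}{2}\|u\|_{L^2(\Omega)}^2 \quad \text{s.t.}\quad -\Delta y + \mathcal N_\theta(y) = u \ \text{in } \Omega, \qquad y = 0 \ \text{on } \partial\Omega,
\]
where any error analysis must track how the discrepancy between $\mathcal N_\theta$ and the true nonlinearity propagates to the optimal control.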