Accelerating Stochastic Sequential Quadratic Programming for Equality Constrained Optimization using Predictive Variance Reduction

In this paper, we propose a stochastic variance reduction method for solving equality constrained optimization problems. Specifically, we develop a method based on the sequential quadratic programming paradigm that utilizes gradient approximations via predictive variance reduction techniques. Under reasonable assumptions, we prove that a measure of first-order stationarity evaluated at the iterates generated by our … Read more
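The "predictive variance reduction" gradient approximation mentioned above is the SVRG-style estimator: a stochastic gradient corrected by a periodically recomputed full gradient at a snapshot point. Below is a minimal sketch of that estimator on a made-up least-squares objective (the data `A`, `b` and the snapshot are illustrative; this is the generic estimator, not the paper's SQP method):

```python
import numpy as np

# Hypothetical finite-sum objective f(x) = (1/n) * sum_i 0.5*(a_i.x - b_i)^2
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def grad_i(x, i):
    """Gradient of the i-th component function."""
    return (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    """Full (exact) gradient, averaged over all components."""
    return A.T @ (A @ x - b) / n

x_ref = rng.normal(size=d)   # snapshot point
g_ref = full_grad(x_ref)     # full gradient computed once at the snapshot
x = x_ref + 0.1 * rng.normal(size=d)  # current iterate

# SVRG estimator: cheap stochastic gradient at x, corrected by the snapshot.
# Its variance shrinks as x approaches x_ref, unlike a plain SGD gradient.
i = rng.integers(n)
g_svrg = grad_i(x, i) - grad_i(x_ref, i) + g_ref
```

Averaging the estimator over all indices recovers the exact gradient, i.e. it is unbiased, which is the property the convergence analyses of such methods rely on.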

A derivative-free trust-funnel method for equality-constrained nonlinear optimization

In this work, we look into new derivative-free methods for solving equality-constrained optimization problems. Of particular interest are trust-region techniques, which have been investigated for the unconstrained and bound-constrained cases. For solving equality-constrained optimization problems, we introduce a derivative-free adaptation of the trust-funnel method combined with a self-correcting geometry scheme and present some encouraging … Read more


We generalize the Nelder-Mead simplex algorithm, LTMADS, and the frame-based methods for function minimization to Riemannian manifolds. Examples are given for functions defined on the special orthogonal Lie group $\mathcal{SO}(n)$ and the Grassmann manifold $\mathcal{G}(n,k)$. Our main examples apply the generalized LTMADS algorithm to equality-constrained optimization problems and to the Whitney … Read more


We present a general procedure for handling equality constraints in optimization problems that is of particular use in direct search methods. First, we provide the necessary background from differential geometry. In particular, we will see what a Riemannian manifold is, what a tangent space is, how to move over a manifold and how to … Read more

Nonlinear programming without a penalty function or a filter

A new method is introduced for solving equality constrained nonlinear optimization problems. This method does not use a penalty function, nor a barrier or a filter, and yet can be proved to be globally convergent to first-order stationary points. It uses different trust-regions to cope with the nonlinearities of the objective function and the constraints, … Read more

On Handling Free Variables in Interior-Point Methods for Conic Linear Optimization

We revisit a regularization technique of Meszaros for handling free variables within interior-point methods for conic linear optimization. We propose a simple computational strategy, supported by a global convergence analysis, for handling the regularization. Using test problems from benchmark suites and recent applications, we demonstrate that the modern code SDPT3 modified to incorporate the proposed … Read more

A Null Space Method for Solving System of Equations

We transform the system of nonlinear equations into a nonlinear programming problem, which is solved by null space algorithms. We do not use the standard least-squares approach. Instead, we divide the equations into two groups: one group is treated as equality constraints, while the sum of squares of the remaining equations is taken as the objective function. … Read more
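The splitting idea in this abstract can be illustrated on a toy system. Below, one equation of a hypothetical 2x2 nonlinear system $F(x)=0$ becomes an equality constraint and the square of the other becomes the objective; the solver (SciPy's SLSQP) and the example system are my own illustrative choices, not the paper's null-space algorithm:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical system F(x) = 0 with two equations in two unknowns:
#   f1(x) = x0 + x1 - 2 = 0   (treated as an equality constraint)
#   f2(x) = x0*x1 - 1 = 0     (its square becomes the objective)
# Both are satisfied at x = (1, 1).

def f1(x):
    return x[0] + x[1] - 2.0

def obj(x):
    # Square of the remaining equation, driven to zero by the minimizer
    return (x[0] * x[1] - 1.0) ** 2

res = minimize(obj, x0=np.array([0.5, 2.0]),
               constraints=[{"type": "eq", "fun": f1}],
               method="SLSQP")
# At the minimizer both residuals are (approximately) zero, so the
# constrained minimum of the reformulation solves the original system.
```

The point of the reformulation is that a solver for equality-constrained programs can then be applied to the root-finding problem without forming the full least-squares objective over all equations.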