Efficient Optimization Algorithms for Robust Principal Component Analysis and Its Variants

Robust PCA has drawn significant attention in the last decade due to its success in numerous application domains, ranging from bioinformatics, statistics, and machine learning to image and video processing in computer vision. Robust PCA and its variants, such as sparse PCA and stable PCA, can be formulated as optimization problems with exploitable special structures. …
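
The truncated abstract leaves the formulation unstated; as a point of reference (the standard principal component pursuit model, not necessarily the exact variant treated here), robust PCA decomposes a data matrix $M$ into a low-rank part $L$ and a sparse part $S$ via

\[
\min_{L,\,S}\; \|L\|_* + \lambda \|S\|_1 \quad \text{subject to} \quad L + S = M,
\]

where $\|\cdot\|_*$ is the nuclear norm, $\|\cdot\|_1$ the entrywise $\ell_1$ norm, and $\lambda > 0$ a trade-off parameter; stable PCA typically relaxes the equality constraint to $\|M - L - S\|_F \le \delta$ to tolerate dense noise.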

Sensitivity Analysis for Nonlinear Programming in CasADi

We present an extension of the CasADi numerical optimization framework that allows arbitrary-order NLP sensitivities to be calculated automatically and efficiently. The approach, which can be used together with any NLP solver available in CasADi, is based on a sparse QR factorization and an implementation of a primal-dual active set method. The whole toolchain …
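
For orientation (a textbook derivation, not a description of CasADi's internals), first-order NLP sensitivities follow from differentiating the KKT conditions in the parameter. For $\min_x f(x; p)$ subject to $g(x; p) = 0$ with Lagrangian $\mathcal{L} = f + \lambda^\top g$, the sensitivities of a regular primal-dual solution $(x^*, \lambda^*)$ solve the linear system

\[
\begin{bmatrix} \nabla_{xx}^2 \mathcal{L} & \nabla_x g \\ \nabla_x g^\top & 0 \end{bmatrix}
\begin{bmatrix} \partial x^*/\partial p \\ \partial \lambda^*/\partial p \end{bmatrix}
= - \begin{bmatrix} \nabla_{xp}^2 \mathcal{L} \\ \partial g/\partial p \end{bmatrix},
\]

and higher orders come from differentiating again. With inequality constraints this system is only valid for a fixed active set, which is where a primal-dual active set method such as the one in the abstract comes in.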

A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions

To improve the performance of the limited-memory variable metric L-BFGS method for large-scale unconstrained optimization, repeating some BFGS updates was proposed in [1, 2]. But the suitable extra updates need to be selected carefully, since the repeating process can be time-consuming. We show that for the limited-memory variable metric BNS method, matrix …
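
For context, the classical BFGS update being repeated has the standard inverse-Hessian form, with $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and $\rho_k = 1/(y_k^\top s_k)$:

\[
H_{k+1} = \left(I - \rho_k s_k y_k^\top\right) H_k \left(I - \rho_k y_k s_k^\top\right) + \rho_k s_k s_k^\top.
\]

Limited-memory variants such as L-BFGS and BNS never form $H_k$ explicitly but apply a small number of these updates from stored pairs $(s_i, y_i)$, so repeating updates trades extra work per iteration for a better quadratic model.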

A family of spectral gradient methods for optimization

We propose a family of spectral gradient methods whose stepsize is determined by a convex combination of the short Barzilai-Borwein (BB) stepsize and the long BB stepsize. It is shown that each member of the family possesses a certain quasi-Newton property in a least-squares sense. The family also includes some other gradient methods as …
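
A minimal sketch of the idea in code (illustrative only; the function name, parameter choices, and the absence of safeguards are my own simplifications, not the paper's method):

import numpy as np

def bb_family_step(A, b, x0, gamma=0.5, tol=1e-8, max_iter=500):
    # Minimize 0.5*x'Ax - b'x for symmetric positive definite A with a
    # stepsize that convexly combines the two Barzilai-Borwein formulas.
    x = np.asarray(x0, dtype=float)
    g = A @ x - b
    if np.linalg.norm(g) < tol:
        return x
    alpha = 1.0 / np.linalg.norm(g)            # crude initial stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        bb_long = (s @ s) / (s @ y)             # "long" BB stepsize
        bb_short = (s @ y) / (y @ y)            # "short" BB stepsize
        alpha = gamma * bb_short + (1.0 - gamma) * bb_long
        x, g = x_new, g_new
    return x

# Example: bb_family_step(np.diag([1.0, 10.0, 100.0]), np.ones(3), np.zeros(3))

With gamma = 1 this reduces to the short BB method and with gamma = 0 to the long one, so the family interpolates between the two as the abstract describes.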

Parallel and Distributed Successive Convex Approximation Methods for Big-Data Optimization

Recent years have witnessed a surge of interest in parallel and distributed optimization methods for large-scale systems. In particular, nonconvex large-scale optimization problems have found a wide range of applications in several engineering fields. The design and analysis of such complex, large-scale systems pose several challenges and call for the development of new optimization …
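
The prototypical successive convex approximation iteration behind such methods (a generic template, not necessarily the exact scheme developed in this work): at the current iterate $x^k$, the nonconvex objective $F$ is replaced by a strongly convex surrogate $\widetilde{F}(\cdot\,; x^k)$ that matches its first-order behavior at $x^k$, and one sets

\[
\hat{x}^k \in \arg\min_{x \in \mathcal{K}} \widetilde{F}(x; x^k), \qquad x^{k+1} = x^k + \gamma^k \left(\hat{x}^k - x^k\right), \quad \gamma^k \in (0, 1].
\]

Parallelism comes from choosing surrogates that are separable across blocks of variables, so the inner minimization splits into independent subproblems that can be solved on different processors.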

Stable interior point method for convex quadratic programming with strict error bounds

We present a short-step interior point method for solving a class of nonlinear programming problems with quadratic objective function. Convex quadratic programming problems can be reformulated as problems in this class. The method is shown to have weakly polynomial time complexity. A complete proof of the numerical stability of the method is provided. No …
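
For reference, the problem class contains convex quadratic programs in the standard form

\[
\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\, x^\top Q x + c^\top x \quad \text{subject to} \quad Ax = b, \; x \ge 0, \qquad Q \succeq 0,
\]

and a short-step method, generically, keeps its iterates in a narrow neighborhood of the central path defined by the perturbed complementarity conditions $x_i s_i = \mu$, shrinking $\mu$ by a factor of the form $1 - \theta/\sqrt{n}$ per step; this is what yields the $O(\sqrt{n}\, \log(1/\varepsilon))$ iteration bounds typical of such methods.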

An Envelope for Davis-Yin Splitting and Strict Saddle Point Avoidance

It is known that operator splitting methods based on Forward-Backward Splitting (FBS), Douglas-Rachford Splitting (DRS), and Davis-Yin Splitting (DYS) decompose a difficult optimization problem into simpler subproblems under proper convexity and smoothness assumptions. In this paper, we identify an envelope (an objective function) whose gradient descent iteration under a variable metric coincides with DYS …
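
For context, the Davis-Yin iteration for $\min_x f(x) + g(x) + h(x)$ with $h$ smooth reads, in its standard form (the paper's envelope is a function built on top of this scheme),

\[
x_g^k = \operatorname{prox}_{\gamma g}(z^k), \qquad
x_f^k = \operatorname{prox}_{\gamma f}\!\left(2 x_g^k - z^k - \gamma \nabla h(x_g^k)\right), \qquad
z^{k+1} = z^k + \lambda_k \left(x_f^k - x_g^k\right).
\]

Setting $h \equiv 0$ recovers Douglas-Rachford splitting and $f \equiv 0$ recovers forward-backward splitting, which is why an envelope for DYS subsumes the known FBS and DRS envelopes.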

A global hybrid derivative-free method for large-scale systems of nonlinear equations

This work concerns the numerical solution of large-scale systems of nonlinear equations when derivatives are not available for use, but assuming that all functions defining the problem are continuously differentiable. A hybrid approach is taken, based on a derivative-free iterative method organized in three phases. The first phase is defined by derivative-free versions of a …
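
As a toy illustration of one derivative-free ingredient such methods build on (my own minimal code in the spirit of spectral residual iterations like DF-SANE; the paper's three-phase organization and its globalization safeguards are not reproduced here):

import numpy as np

def spectral_residual(F, x0, tol=1e-8, max_iter=1000):
    # Solve F(x) = 0 without derivatives: step along -F(x) scaled by a
    # Barzilai-Borwein-type spectral coefficient updated from (s, y) pairs.
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    sigma = 1.0
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        x_new = x - sigma * Fx
        F_new = F(x_new)
        s, y = x_new - x, F_new - Fx
        sty = s @ y
        sigma = (s @ s) / sty if abs(sty) > 1e-12 else 1.0   # safeguarded update
        x, Fx = x_new, F_new
    return x

# Example on a symmetric positive definite linear system:
# A = np.array([[3.0, 1.0], [1.0, 2.0]]); b = np.array([1.0, 1.0])
# x = spectral_residual(lambda v: A @ v - b, np.zeros(2))

Without a nonmonotone line search this plain iteration can diverge on hard problems, which is precisely why hybrid schemes with several phases, as in the abstract, are of interest.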

On a Frank-Wolfe Type Theorem in Cubic Optimization

A classical result due to Frank and Wolfe (1956) says that a quadratic function $f$ attains its supremum on a nonempty polyhedron $M$ if $f$ is bounded from above on $M$. In this note, we present a rigorous proof of the extension of this result to cubic optimization, known from Andronov, Belousov and Shironin (1982). …
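
Worth noting: cubic is the highest degree for which such a theorem can hold, since the Frank-Wolfe property fails for quartics. A classical counterexample (standard in the literature, not taken from this note) is

\[
f(x, y) = -x^2 - (1 - xy)^2 \quad \text{on} \quad M = \mathbb{R}^2:
\]

$f$ is bounded above by $0$ and $f(t, 1/t) = -t^2 \to 0$ as $t \to 0$, but $f(x, y) = 0$ would require $x = 0$ and $xy = 1$ simultaneously, so the supremum is not attained.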

A Special Complementarity Function Revisited

Recently, a local framework of Newton-type methods for constrained systems of equations has been developed which, applied to the solution of Karush-Kuhn-Tucker (KKT) systems, enables local quadratic convergence under conditions that allow nonisolated and degenerate KKT points. This result is based on a reformulation of the KKT conditions as a constrained piecewise smooth system of …
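
For readers outside this niche: a complementarity function is a map $\varphi : \mathbb{R}^2 \to \mathbb{R}$ with $\varphi(a, b) = 0$ if and only if $a \ge 0$, $b \ge 0$, $ab = 0$, which allows the complementarity part of the KKT conditions to be rewritten as equations. A standard example (given for illustration; not necessarily the special function revisited in this paper) is the Fischer-Burmeister function

\[
\varphi_{FB}(a, b) = \sqrt{a^2 + b^2} - a - b,
\]

which is smooth everywhere except at the origin; this lack of smoothness at exactly the degenerate pairs $a = b = 0$ is what makes the resulting KKT reformulation only piecewise smooth and motivates Newton-type methods for constrained systems of equations.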