Hager-Zhang Active Set Algorithm for Large-Scale Continuous Knapsack Problems

Many real-world optimization problems involve minimizing a nonlinear (or quadratic) functional subject to bound constraints and a single linear constraint, given either as an equality or as a bilateral inequality; such problems are commonly called continuous knapsack problems. Since there are efficient methods for solving large-scale bound-constrained nonlinear programs, it is desirable to … Read more
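
Concretely, and using generic symbols rather than the paper's own notation, a continuous knapsack problem has the form

\[
\min_{x \in \mathbb{R}^n} \; f(x) \quad \text{subject to} \quad \ell \le x \le u, \qquad r \le a^{\mathsf{T}} x \le s,
\]

where the linear constraint collapses to the equality $a^{\mathsf{T}} x = r$ when $r = s$.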

Sparse Signal Reconstruction via Iterative Support Detection

We present a novel sparse signal reconstruction method, Iterative Support Detection (ISD), aiming to achieve fast reconstruction and a reduced requirement on the number of measurements compared to the classical $\ell_1$ minimization approach. ISD addresses failed reconstructions of $\ell_1$ minimization due to insufficient measurements. It estimates a support set I from a current reconstruction and obtains a new … Read more
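
In outline (a generic sketch of the truncated $\ell_1$ subproblem behind support-detection methods of this kind, with $A$ and $b$ denoting the measurement matrix and the measurements), each iteration solves

\[
\min_{x} \; \sum_{i \notin I} |x_i| \quad \text{subject to} \quad Ax = b,
\]

i.e. an $\ell_1$ minimization that does not penalize the entries indexed by the detected support $I$; the support estimate is then refreshed from the new reconstruction and the process repeats.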

Compressed Sensing: How sharp is the RIP?

Consider a measurement matrix A of size n×N, with n < N, y a signal in R^N, and b = Ay the observed measurement of the vector y. From knowledge of (b,A), compressed sensing seeks to recover the k-sparse x, k < n, which minimizes ||b-Ax||. Using various methods of analysis — convex polytopes, geometric … Read more
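
In display form, the recovery problem just described is

\[
\min_{x \in \mathbb{R}^N} \; \|b - Ax\|_2 \quad \text{subject to} \quad \|x\|_0 \le k,
\]

and the restricted isometry constants referred to in the title are the smallest $\delta_k$ with $(1-\delta_k)\|x\|_2^2 \le \|Ax\|_2^2 \le (1+\delta_k)\|x\|_2^2$ for every $k$-sparse $x$.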

Phase Transitions for Greedy Sparse Approximation Algorithms

A major enterprise in compressed sensing and sparse approximation is the design and analysis of computationally tractable algorithms for recovering sparse, exact or approximate, solutions of underdetermined linear systems of equations. Many such algorithms have now been proven, using the ubiquitous Restricted Isometry Property (RIP) [9], to have optimal-order uniform recovery guarantees. However, it is … Read more
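
As one concrete member of this family of algorithms, the sketch below implements plain iterative hard thresholding (IHT) with NumPy; the step size, iteration count, and problem sizes are illustrative choices for a toy instance, not parameters taken from the paper.

import numpy as np

def iht(A, b, k, n_iter=200, step=None):
    # Iterative hard thresholding: gradient step on 0.5*||b - A x||_2^2,
    # then keep only the k largest-magnitude entries.
    n, N = A.shape
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # conservative step: 1 / ||A||_2^2
    x = np.zeros(N)
    for _ in range(n_iter):
        x = x + step * A.T @ (b - A @ x)         # gradient step
        small = np.argsort(np.abs(x))[:-k]       # indices of the N-k smallest entries
        x[small] = 0.0                           # hard-threshold to a k-sparse vector
    return x

# Toy instance: Gaussian measurement matrix and a k-sparse ground truth.
rng = np.random.default_rng(0)
n, N, k = 80, 200, 8
A = rng.standard_normal((n, N)) / np.sqrt(n)
x_true = np.zeros(N)
x_true[rng.choice(N, size=k, replace=False)] = rng.standard_normal(k)
b = A @ x_true
print(np.linalg.norm(iht(A, b, k) - x_true))     # reconstruction error on the toy instance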

Code verification by static analysis: a mathematical programming approach

Automatic verification of computer code is of paramount importance in embedded systems supplying essential services. One of the most important verification techniques is static code analysis by abstract interpretation: the concrete semantics of a programming language (i.e. the values $\chi$ that variable symbols {\tt x} appearing in a program can take during its execution) are replaced … Read more
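
As background on the abstract-interpretation side (a textbook toy example, unrelated to the specific mathematical programming formulation the paper develops), the classic interval domain replaces the set of values a variable can take by a single enclosing interval and lifts arithmetic and control-flow joins to intervals:

from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        # Sound over-approximation of {a + b : a in self, b in other}
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def join(self, other):
        # Least upper bound, used to merge the two branches of a conditional
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

# x is read from input and known to lie in [0, 10]; one branch adds 1, the other adds 5
x = Interval(0.0, 10.0)
after_if = (x + Interval(1.0, 1.0)).join(x + Interval(5.0, 5.0))
print(after_if)   # Interval(lo=1.0, hi=15.0): a safe bound on all concrete outcomes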

Prediction of the binding affinities of peptides to class II MHC using a regularized thermodynamic model

The binding of peptide fragments of extracellular proteins to class II MHC is a crucial event in the adaptive immune response. Each MHC allotype generally binds a distinct subset of peptides, and the enormous number of possible peptide epitopes prevents their complete experimental characterization. Computational methods can utilize the limited experimental data to predict the … Read more

Transmission Expansion Planning with Re-design

Expanding an electrical transmission network requires heavy investments that need to be carefully planned, often at a regional or national level. We study relevant theoretical and practical aspects of transmission expansion planning, formulated as a bilinear programming problem with mixed 0-1 variables. We show that the problem is NP-hard and that, unlike the so-called Network … Read more
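
For readers wondering where the bilinearity comes from, a typical source in DC-flow-based expansion models (a generic sketch, not necessarily the exact model of the paper) is the flow equation for a candidate line:

\[
f_{ij} \;=\; y_{ij}\, B_{ij}\, (\theta_i - \theta_j), \qquad y_{ij} \in \{0,1\},
\]

where the binary variable $y_{ij}$ decides whether line $(i,j)$ is built, $B_{ij}$ is its susceptance, and $\theta_i, \theta_j$ are nodal phase angles; the product of $y_{ij}$ with the continuous angle difference is what makes the program bilinear in mixed 0-1 variables.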

Quasi-Newton methods on Grassmannians and multilinear approximations of tensors

In this paper we propose quasi-Newton and limited-memory quasi-Newton methods for objective functions defined on Grassmannians or a product of Grassmannians. Specifically, we define BFGS and L-BFGS updates in local and global coordinates on Grassmannians or a product of these. We prove that, when local coordinates are used, our BFGS updates on Grassmannians share … Read more
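
For context, the classical Euclidean BFGS update of a Hessian approximation $B_k$, with step $s_k$ and gradient difference $y_k$, reads

\[
B_{k+1} \;=\; B_k \;-\; \frac{B_k s_k s_k^{\mathsf{T}} B_k}{s_k^{\mathsf{T}} B_k s_k} \;+\; \frac{y_k y_k^{\mathsf{T}}}{y_k^{\mathsf{T}} s_k};
\]

on a Grassmannian, $s_k$, $y_k$, and $B_k$ live in different tangent spaces, so the same recipe only makes sense once everything is expressed in suitable coordinates, which is the role of the local and global coordinate versions mentioned in the abstract.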

SINCO – a greedy coordinate ascent method for sparse inverse covariance selection problem

In this paper, we consider the sparse inverse covariance selection problem, which is equivalent to structure recovery of a Markov Network over Gaussian variables. We introduce a simple but efficient greedy algorithm, called {\em SINCO}, for solving the Sparse INverse COvariance problem. Our approach is based on a coordinate ascent method, which naturally preserves the sparsity … Read more
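
The underlying problem, in its standard $\ell_1$-penalized maximum-likelihood form (generic notation, not necessarily the paper's), is

\[
\max_{X \succ 0} \;\; \log\det X \;-\; \operatorname{tr}(S X) \;-\; \lambda \sum_{i,j} |X_{ij}|,
\]

where $S$ is the empirical covariance matrix and the penalty drives entries of the precision matrix $X$ to zero, i.e. removes edges from the corresponding Gaussian Markov network.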

Machine Learning for Global Optimization

In this paper we introduce the LeGO (Learning for Global Optimization) approach, in which machine learning is used to predict the outcome of a computationally expensive global optimization run, based upon suitable training data obtained from standard runs of the same global optimization method. We propose to use a Support Vector Machine … Read more
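
As a hedged illustration of this kind of pipeline (the objective function, features, labels, and thresholds below are invented for the example and are not the authors' setup), one can train a support vector classifier to predict, from a starting point alone, whether a local run launched there is worth the cost:

import numpy as np
from sklearn.svm import SVC
from scipy.optimize import minimize

# Toy multimodal objective, standing in for an expensive function.
def f(x):
    return np.sum(x**2 + 10.0 * (1.0 - np.cos(2.0 * np.pi * x)))

rng = np.random.default_rng(1)
starts = rng.uniform(-5.0, 5.0, size=(300, 2))

# "Standard runs": local searches from each start; label a start as promising
# if the value it reaches is in the best quartile.
values = np.array([minimize(f, x0, method="L-BFGS-B").fun for x0 in starts])
labels = (values < np.quantile(values, 0.25)).astype(int)

# Train an SVM to predict, before running, whether a new start looks promising.
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(starts, labels)
new_starts = rng.uniform(-5.0, 5.0, size=(5, 2))
print(clf.predict(new_starts))   # 1 = predicted promising starting point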