Trace Norm Regularization: Reformulations, Algorithms, and Multi-task Learning

We consider a recently proposed optimization formulation of multi-task learning based on trace norm regularized least squares. While this problem may be formulated as a semidefinite program (SDP), its size puts it beyond the reach of general-purpose SDP solvers. Previous solution approaches apply proximal gradient methods to solve the primal problem. We derive new primal and dual reformulations of … Read more
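To make the baseline concrete, here is a minimal NumPy sketch of the proximal gradient approach the abstract refers to, applied to the multi-task objective min_W 0.5‖XW − Y‖_F² + λ‖W‖_*. The variable names, step size, and toy data are illustrative assumptions; the sketch does not reproduce the new primal and dual reformulations derived in the paper.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: prox of tau * ||.||_* (trace norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def trace_norm_ls(X, Y, lam, n_iter=500):
    """Proximal gradient for min_W 0.5*||X W - Y||_F^2 + lam*||W||_*.
    Columns of Y correspond to tasks sharing the feature matrix X."""
    d, k = X.shape[1], Y.shape[1]
    W = np.zeros((d, k))
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ W - Y)           # gradient of the least squares term
        W = svt(W - grad / L, lam / L)     # proximal step on the trace norm
    return W

# Toy usage: 3 related tasks with a shared low-rank weight matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
W_true = rng.standard_normal((20, 1)) @ rng.standard_normal((1, 3))  # rank-1
Y = X @ W_true + 0.1 * rng.standard_normal((100, 3))
W_hat = trace_norm_ls(X, Y, lam=1.0)
print(np.linalg.matrix_rank(W_hat, tol=1e-3))
```

The proximal step is singular value thresholding, which shrinks the singular values of W and thereby promotes a low-rank, task-coupled solution.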

Covariance regularization in inverse space

This paper proposes to apply Gaussian graphical models to estimate a large-scale normal distribution from a relatively small number of satellite observations in the context of data assimilation. Data assimilation is a field, developed mainly in meteorology and oceanography, that fits simulation models to observation data. The optimization problem tends to be huge … Read more
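As a hedged illustration of the underlying estimator (not the paper's data-assimilation-specific method), the l1-penalized inverse covariance estimate behind a Gaussian graphical model can be computed with scikit-learn's GraphicalLasso; the sample data and penalty alpha below are placeholders.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

# Toy stand-in for model-state samples; real data-assimilation problems
# are far larger and typically exploit spatial structure.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))

# l1-penalized maximum likelihood estimate of the precision (inverse
# covariance) matrix.
model = GraphicalLasso(alpha=0.2).fit(X)
precision = model.precision_
print("nonzero off-diagonal entries:", (np.abs(precision) > 1e-8).sum() - 30)
```

Zeros in the estimated precision matrix encode conditional independence between variables, which is what keeps the representation sparse and tractable at large scale.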

The L1-Norm Best-Fit Hyperplane Problem

We formalize an algorithm for solving the L1-norm best-fit hyperplane problem derived using first principles and geometric insights about L1 projection and L1 regression. The procedure follows from a new proof of global optimality and relies on the solution of a small number of linear programs. The procedure is implemented for validation and testing. This … Read more
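Below is a minimal sketch of the kind of LP subproblem such procedures rely on, assuming the standard reduction of the hyperplane fit to least-absolute-deviations (L1) regressions with each coordinate treated in turn as the dependent one. The paper's exact procedure and its optimality proof are not reproduced here, and the names and toy data are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def lad_regression(A, y):
    """Least absolute deviations fit y ~ A @ beta + b as a linear program:
    minimize sum_i t_i  subject to  -t_i <= A_i @ beta + b - y_i <= t_i."""
    n, p = A.shape
    # Variable vector: [beta (p), b (1), t (n)]
    c = np.concatenate([np.zeros(p + 1), np.ones(n)])
    ones = np.ones((n, 1))
    A_ub = np.block([[A, ones, -np.eye(n)],
                     [-A, -ones, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * (p + 1) + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p], res.x[p], res.fun

def l1_best_fit_hyperplane(X):
    """Treat each coordinate as the 'dependent' one, solve an LP for each,
    and keep the fit with the smallest total L1 residual."""
    best = None
    for j in range(X.shape[1]):
        A = np.delete(X, j, axis=1)
        beta, b, obj = lad_regression(A, X[:, j])
        if best is None or obj < best[0]:
            best = (obj, j, beta, b)
    return best

rng = np.random.default_rng(0)
pts = rng.standard_normal((50, 3))
pts[:, 2] = 0.5 * pts[:, 0] - pts[:, 1] + 0.05 * rng.standard_normal(50)
print(l1_best_fit_hyperplane(pts))
```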

An Improved Algorithm for the Solution of the Entire Regularization Path of Support Vector Machine

This paper describes an improved algorithm for the numerical solution to the Support Vector Machine (SVM) classification problem for all values of the regularization parameter, C. The algorithm is motivated by the work of Hastie et al. and follows the main idea of tracking the optimality conditions of the SVM solution for descending values of … Read more
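For reference, these are the optimality (KKT) conditions being tracked, written for the standard C-parameterized SVM dual with f(x) = Σ_j α_j y_j K(x, x_j) + b; the paper's own notation may differ.

```latex
% KKT conditions for the C-parameterized SVM dual, 0 <= alpha_i <= C.
\begin{aligned}
y_i f(x_i) > 1 &\;\Longrightarrow\; \alpha_i = 0, \\
y_i f(x_i) < 1 &\;\Longrightarrow\; \alpha_i = C, \\
y_i f(x_i) = 1 &\;\Longrightarrow\; 0 \le \alpha_i \le C.
\end{aligned}
```

Between events where a training point moves from one of these sets to another, the solution changes smoothly (piecewise-linearly in the regularization parameter in the parameterization of Hastie et al.), which is what makes it feasible to trace the entire path rather than re-solving the SVM for each value of C.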

An accelerated proximal gradient algorithm for nuclear norm regularized least squares problems

The affine rank minimization problem, which consists of finding a matrix of minimum rank subject to linear equality constraints, has been proposed in many areas of engineering and science. A specific rank minimization problem is the matrix completion problem, in which we wish to recover a (low-rank) data matrix from incomplete samples of its entries. … Read more
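The following is a hedged NumPy sketch of an accelerated proximal gradient (FISTA-style) iteration for the nuclear norm regularized least squares form of matrix completion, min_X 0.5‖P_Ω(X − M)‖_F² + μ‖X‖_*. The continuation strategy, stopping rules, and partial SVD techniques used in practical implementations are omitted, and the names and toy data are illustrative.

```python
import numpy as np

def svt(M, tau):
    """Prox of tau*||.||_*: shrink singular values by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def apg_matrix_completion(M_obs, mask, mu, n_iter=300):
    """FISTA-style accelerated proximal gradient for
    min_X 0.5*||mask*(X - M_obs)||_F^2 + mu*||X||_*."""
    X = np.zeros_like(M_obs)
    Z, t = X.copy(), 1.0                      # extrapolation point and momentum
    for _ in range(n_iter):
        grad = mask * (Z - M_obs)             # gradient of the smooth term at Z
        X_new = svt(Z - grad, mu)             # step size 1: Lipschitz constant is 1 for a 0/1 mask
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
        Z = X_new + ((t - 1) / t_new) * (X_new - X)
        X, t = X_new, t_new
    return X

# Toy low-rank completion problem with roughly half the entries observed.
rng = np.random.default_rng(0)
M = rng.standard_normal((60, 5)) @ rng.standard_normal((5, 60))
mask = (rng.random((60, 60)) < 0.5).astype(float)
X_hat = apg_matrix_completion(M * mask, mask, mu=2.0)
print("relative error:", np.linalg.norm(X_hat - M) / np.linalg.norm(M))
```

Each iteration costs one SVD of the current iterate; large-scale implementations replace this with a truncated or partial SVD.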

Classification with Guaranteed Probability of Error

We introduce a general-purpose learning machine that we call the “Guaranteed Error Machine”, or GEM, and two learning algorithms, a “real GEM algorithm” and an “ideal GEM algorithm”. The real GEM algorithm is for use in real applications, while the ideal GEM algorithm is introduced as a theoretical tool; however, these two algorithms have identical … Read more

Hybrid MPI/OpenMP parallel support vector machine training

Support Vector Machines are a powerful machine learning technology, but the training process involves a dense quadratic optimization problem and is computationally challenging. A parallel implementation of Support Vector Machine training has been developed, using a combination of MPI and OpenMP. Using an interior point method for the optimization and a reformulation that avoids the … Read more

Support vector machines with the ramp loss and the hard margin loss

In the interest of deriving classifiers that are robust to outlier observations, we present integer programming formulations of Vapnik’s support vector machine (SVM) with the ramp loss and hard margin loss. The ramp loss allows a maximum error of 2 for each training observation, while the hard margin loss calculates error by counting the number … Read more
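For orientation, one common big-M mixed-integer formulation of the ramp-loss SVM is shown below; the paper's formulations and any strengthening inequalities may differ, and M denotes a sufficiently large constant.

```latex
% A common big-M MIP for the ramp-loss SVM (notation may differ from the paper).
\begin{aligned}
\min_{w,\,b,\,\xi,\,z}\quad & \tfrac{1}{2}\lVert w\rVert^{2} + C\sum_{i} \bigl(\xi_i + 2 z_i\bigr) \\
\text{s.t.}\quad & y_i\,(w^{\top} x_i + b) \;\ge\; 1 - \xi_i - M z_i \quad \text{for all } i, \\
& 0 \le \xi_i \le 2, \qquad z_i \in \{0, 1\}.
\end{aligned}
```

Setting the binary z_i to 1 deactivates the margin constraint for observation i at a fixed cost of 2C, which mirrors the cap of 2 that the ramp loss places on any single observation's error.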

Efficient Algorithmic Techniques for Several Multidimensional Geometric Data Management and Analysis Problems

In this paper I present several novel, efficient algorithmic techniques for solving some multidimensional geometric data management and analysis problems. The techniques are based on several data structures from computational geometry (e.g. segment tree and range tree) and on the well-known sweep-line method. Citation: Proceedings of the 3rd International Conference “Knowledge Management – Projects, Systems and Technologies”, … Read more
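As a minimal, self-contained illustration of the sweep-line method mentioned above (a one-dimensional toy problem, not one of the paper's multidimensional problems), the following sketch computes the maximum number of intervals covering any single point.

```python
def max_overlap(intervals):
    """Sweep-line over interval endpoints: +1 at a start, -1 at an end,
    tracking the running count of intervals covering the sweep point."""
    events = []
    for lo, hi in intervals:
        events.append((lo, +1))
        events.append((hi, -1))
    # Process ends before starts at equal coordinates so touching
    # intervals are not counted as overlapping.
    events.sort(key=lambda e: (e[0], e[1]))
    best = cur = 0
    for _, delta in events:
        cur += delta
        best = max(best, cur)
    return best

print(max_overlap([(0, 3), (1, 4), (2, 5), (6, 7)]))  # -> 3
```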