Kernel Support Vector Regression with imprecise output

We consider a regression problem where uncertainty affects the dependent variable of the elements of the database. A model based on the standard epsilon-Support Vector Regression approach is given, where two hyperplanes need to be constructed to predict the interval-valued dependent variable. By using the Hausdorff distance to measure the error between predicted and … Read more
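The abstract is cut off, but the Hausdorff distance between interval-valued outputs is concrete enough to sketch. The helper names below (`hausdorff_interval`, `predict_interval`) are illustrative, not from the paper, and the two linear functions stand in for the paper's two hyperplanes:

```python
def hausdorff_interval(a, b):
    """Hausdorff distance between closed intervals a = [a1, a2] and b = [b1, b2]:
    max of the endpoint gaps."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def predict_interval(x, w_lo, b_lo, w_hi, b_hi):
    """Predict an interval from two linear functions (one per bound);
    sorting the two scores guarantees a well-formed interval."""
    lo = sum(wi * xi for wi, xi in zip(w_lo, x)) + b_lo
    hi = sum(wi * xi for wi, xi in zip(w_hi, x)) + b_hi
    return (min(lo, hi), max(lo, hi))
```

Training would then penalize `hausdorff_interval(predicted, observed)` instead of a pointwise residual.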

Building separating concentric balls to solve a multi-instance classification problem

In this work, we consider a classification problem where the objects to be classified are bags of instances, each instance being a vector measuring d different attributes. The classification rule is defined in terms of a ball, whose center and radius are the parameters to be computed. Given a bag, it is assigned to the positive class … Read more
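The classification rule itself is truncated; as a hedged illustration, the sketch below assumes the common multi-instance convention that a bag is positive when at least one of its instances falls inside the ball (function names are hypothetical):

```python
import math

def in_ball(instance, center, radius):
    """True if the instance lies within the Euclidean ball."""
    d = math.sqrt(sum((xi - ci) ** 2 for xi, ci in zip(instance, center)))
    return d <= radius

def classify_bag(bag, center, radius):
    """Assumed rule: the bag is positive iff some instance lies in the ball."""
    return any(in_ball(x, center, radius) for x in bag)
```

The paper's actual rule (and the role of the concentric balls) may differ; only the center and radius being the learned parameters is taken from the abstract.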

Support Vector Regression for imprecise data

In this work, a regression problem is studied where the elements of the database are sets with certain geometrical properties. In particular, our model can be applied to handle data affected by some kind of noise or uncertainty, interval-valued data, and databases with missing values. The proposed formulation is based on the … Read more

Adaptive Constraint Reduction for Training Support Vector Machines

A support vector machine (SVM) determines whether a given observed pattern lies in a particular class. The decision is based on prior training of the SVM on a set of patterns with known classification, and training is achieved by solving a convex quadratic programming problem. Since there are typically a large number of training patterns, … Read more
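One way to picture constraint reduction, under the assumption that it works by discarding training patterns whose margins make them unlikely support vectors (the paper's actual adaptive criterion is not given here), is:

```python
def select_active_patterns(patterns, labels, w, b, threshold=1.5):
    """Keep only patterns whose functional margin is below a threshold;
    patterns far on the correct side are unlikely to be support vectors,
    so the QP solved at the next iteration can drop their constraints."""
    kept = []
    for x, y in zip(patterns, labels):
        margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
        if margin <= threshold:
            kept.append((x, y))
    return kept
```

In an interior point setting such a filter would be reapplied each iteration, shrinking the working set adaptively as the iterate approaches the solution.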

Enclosing Machine Learning

This report introduces a new machine learning paradigm for data mining called enclosing machine learning. This novel method draws on the human cognition process and tries to imitate, from a macroscopic view, its two basic principles: cognizing things of the same kind and recognizing things of a new kind … Read more
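The two cognition principles can be caricatured with enclosing balls: fit one ball per known class ("cognizing things of the same kind") and flag points enclosed by no ball as a new kind. The centroid-based ball below is a simplification of a true minimum enclosing ball, and all names are illustrative:

```python
import math

def fit_ball(points):
    """Enclose a class with a ball: centroid center, radius = farthest point.
    (A simplification; a minimum enclosing ball would be tighter.)"""
    d = len(points[0])
    center = tuple(sum(p[i] for p in points) / len(points) for i in range(d))
    radius = max(math.dist(p, center) for p in points)
    return center, radius

def recognize(x, balls):
    """Return the label of the first class ball enclosing x,
    or None when x is enclosed by no ball (a thing of a new kind)."""
    for label, (center, radius) in balls.items():
        if math.dist(x, center) <= radius:
            return label
    return None
```

Requires Python 3.8+ for `math.dist`.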

Classification problems with imprecise data through separating hyperplanes

We consider a supervised classification problem in which the elements to be classified are sets with certain geometrical properties. In particular, this model can be applied to deal with data affected by some kind of noise and with interval-valued data. Two classification rules, a fuzzy one and a crisp one, are defined … Read more
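For axis-aligned interval data, the score of a hyperplane over a box is itself an interval, which suggests one plausible reading of the crisp and fuzzy rules. The paper's exact definitions are truncated; this is only a sketch with illustrative names:

```python
def score_range(box, w, b):
    """Range of w.x + b over an axis-aligned box of intervals [(l_i, u_i)]:
    each coordinate's extreme depends on the sign of w_i."""
    lo = b + sum(wi * (l if wi >= 0 else u) for wi, (l, u) in zip(w, box))
    hi = b + sum(wi * (u if wi >= 0 else l) for wi, (l, u) in zip(w, box))
    return lo, hi

def crisp_rule(box, w, b):
    """+1 or -1 when the whole box is on one side; 0 when it straddles."""
    lo, hi = score_range(box, w, b)
    if lo > 0:
        return +1
    if hi < 0:
        return -1
    return 0

def fuzzy_rule(box, w, b):
    """Membership in the positive class: share of the score interval above zero."""
    lo, hi = score_range(box, w, b)
    if hi <= 0:
        return 0.0
    if lo >= 0:
        return 1.0
    return hi / (hi - lo)
```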

Exploiting separability in large-scale linear support vector machine training

Linear support vector machine training can be represented as a large quadratic program. We present an efficient and numerically stable algorithm for this problem using interior point methods, which requires only O(n) operations per iteration. By exploiting the separability of the Hessian, we provide a unified approach, from an optimization perspective, to 1-norm classification, 2-norm … Read more
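Separability pays off because a diagonal-plus-low-rank system can be solved cheaply; the rank-one case via the Sherman-Morrison formula shows the O(n) pattern (this illustrates the general idea, not the paper's exact linear algebra):

```python
def solve_diag_plus_rank1(d, v, b):
    """Solve (D + v v^T) x = b in O(n), with D = diag(d), via Sherman-Morrison:
    x = D^-1 b - (v^T D^-1 b) / (1 + v^T D^-1 v) * D^-1 v."""
    dinv_b = [bi / di for bi, di in zip(b, d)]
    dinv_v = [vi / di for vi, di in zip(v, d)]
    alpha = (sum(vi * x for vi, x in zip(v, dinv_b))
             / (1 + sum(vi * x for vi, x in zip(v, dinv_v))))
    return [x - alpha * y for x, y in zip(dinv_b, dinv_v)]
```

With k rank-one terms the same idea costs O(nk^2) per solve, which stays linear in n as long as k is small.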

Optimization for Simulation: LAD Accelerator

The goal of this paper is to address the problem of evaluating the performance of a system running under unknown values for its stochastic parameters. A new approach called LAD for Simulation, based on simulation and classification software, is presented. It uses a number of simulations with very few replications and records the mean value … Read more

A Coordinate Gradient Descent Method for Linearly Constrained Smooth Optimization and Support Vector Machines Training

Support vector machines (SVMs) training may be posed as a large quadratic program (QP) with bound constraints and a single linear equality constraint. We propose a (block) coordinate gradient descent method for solving this problem and, more generally, linearly constrained smooth optimization. Our method is closely related to decomposition methods currently popular for SVM training. … Read more
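A minimal pairwise step of such a method, in the style of SMO-type decomposition, updates two coordinates so that the single equality constraint y^T a = 0 is preserved and the bounds 0 <= a <= C are respected (the function name and the convention of passing the gradient are illustrative):

```python
def pair_step(alpha, grad, Q, y, i, j, C):
    """One coordinate step on min 1/2 a^T Q a - 1^T a, s.t. y^T a = 0, 0 <= a <= C.
    Moving alpha[i] by y[i]*t and alpha[j] by -y[j]*t keeps y^T alpha unchanged."""
    g = y[i] * grad[i] - y[j] * grad[j]                 # derivative along the direction
    h = Q[i][i] + Q[j][j] - 2 * y[i] * y[j] * Q[i][j]   # curvature along it
    t = -g / h if h > 0 else 0.0
    # clip t so both coordinates stay inside [0, C]
    if y[i] > 0:
        t = max(-alpha[i], min(C - alpha[i], t))
    else:
        t = max(alpha[i] - C, min(alpha[i], t))
    if y[j] > 0:
        t = max(alpha[j] - C, min(alpha[j], t))
    else:
        t = max(-alpha[j], min(C - alpha[j], t))
    alpha[i] += y[i] * t
    alpha[j] -= y[j] * t
    return alpha
```

A full solver would wrap this in a loop with a working-set selection rule; the paper's (block) coordinate gradient descent generalizes this kind of update to arbitrary linearly constrained smooth problems.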