
Kernel Projection Classifiers with Suppressing Features of Other Classes

View through CrossRef
We propose a new kernel-based classification method, the suppressed kernel sample space projection classifier (SKSP), which extends the kernel sample space projection classifier (KSP). In kernel methods, samples are classified after being mapped from the input space to a high-dimensional feature space. The space spanned by the samples of a class in the feature space is called that class's kernel sample space. KSP classifies an unknown input vector into the class whose kernel sample space yields the largest projection norm. KSP can be interpreted as a special case of kernel principal component analysis (KPCA), which is also used for classification; however, KSP has more useful properties than KPCA, and its accuracy matches or exceeds that of a KPCA classifier. Because KSP is a single-class classifier, it learns from samples of the target class only, so in a multiclass problem the per-class computational cost stays the same no matter how many classes there are. Still, more discriminative features can be expected if samples from other classes are also used. SKSP extends KSP by suppressing the influence of the other classes, extracting useful features through an oblique projection. Experiments on two-class classification problems show that SKSP achieves high accuracy on many of them.
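To make the KSP decision rule concrete, here is a minimal sketch in Python (NumPy only). It assumes an RBF kernel and illustrative toy data, so it is a sketch of the general technique rather than the authors' implementation; the oblique projection that SKSP uses to suppress other classes is not shown. The projection norm of a mapped input phi(x) onto a class's kernel sample space span{phi(x_1), ..., phi(x_n)} reduces to k_x^T K^{-1} k_x, where K_ij = k(x_i, x_j) is the class Gram matrix and (k_x)_i = k(x_i, x).

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix of the RBF kernel k(a, b) = exp(-gamma * ||a - b||^2).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def projection_norm_sq(X_class, x, gamma=1.0, ridge=1e-8):
    # Squared projection norm k_x^T K^{-1} k_x of phi(x) onto the kernel
    # sample space of one class; a small ridge keeps K invertible.
    K = rbf_kernel(X_class, X_class, gamma) + ridge * np.eye(len(X_class))
    k_x = rbf_kernel(X_class, x[None, :], gamma).ravel()
    return float(k_x @ np.linalg.solve(K, k_x))

def ksp_predict(class_samples, x, gamma=1.0):
    # KSP rule: assign x to the class with the largest projection norm.
    return int(np.argmax([projection_norm_sq(Xc, x, gamma)
                          for Xc in class_samples]))

# Toy two-class example (hypothetical data, not from the paper).
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(30, 2))
X1 = rng.normal(3.0, 1.0, size=(30, 2))
print(ksp_predict([X0, X1], np.array([2.8, 3.1])))  # expect class 1

Because each class's Gram matrix involves only that class's samples, the per-class cost of this rule is independent of the number of classes, which is the property the abstract highlights for KSP.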

Related Results

Like Me or Like Us
Research has shown abundant evidence for social projection, that is, the tendency to expect similarity between oneself and others (Krueger, 1998a, 1998b). This effect is stronge...
Comparative analysis of information tendency and application features for projection mapping technologies at cultural heritage sites
With the rapid development of interactive technologies using projection mapping (PJM), these digital technologies have introduced new interpretative possibilities for the p...
From specific-source feature-based to common-source score-based likelihood-ratio systems: ranking the stars
This article studies expected performance and practical feasibility of the most commonly used classes of source-level likelihood-ratio (LR) systems when applied to...
Projection in Interaction and Projection in Grammar
In this paper, I argue that there are fundamental common features shared by interaction and grammar that suggest some kind of interdependence between the two and a nonauton...
Definiteness projection
We argue that definite noun phrases give rise to uniqueness inferences characterized by a pattern we call definiteness projection. Definiteness projection says that the uniq...
Combining Exploratory Projection Pursuit and Projection Pursuit Regression with Application to Neural Networks
We present a novel classification and regression method that combines exploratory projection pursuit (unsupervised training) with projection pursuit regression (supervised training...
Noise Robust Projection Rule for Klein Hopfield Neural Networks
Multistate Hopfield models, such as complex-valued Hopfield neural networks (CHNNs), have been used as multistate neural associative memories. Quaternion-valued Hopfield neural net...
