Kernel Projection Classifiers with Suppressing Features of Other Classes
We propose a new classification method based on a kernel technique, the suppressed kernel sample space projection classifier (SKSP), which extends the kernel sample space projection classifier (KSP). In kernel methods, samples are classified after being mapped from an input space to a high-dimensional space called a feature space. The space spanned by the samples of a class in the feature space is defined as that class's kernel sample space. In KSP, an unknown input vector is assigned to the class whose kernel sample space yields the largest projection norm. KSP can be interpreted as a special type of kernel principal component analysis (KPCA), which is also used in classification problems. However, KSP has more useful properties than KPCA, and its accuracy is comparable to or better than that of a KPCA classifier. Since KSP is a single-class classifier, it uses only self-class samples for learning; thus, in a multiclass classification problem, the per-class computational cost does not grow with the number of classes. However, we expect that more discriminative features can be obtained for classification if samples from other classes are also used. By extending KSP to SKSP, the effects of the other classes are suppressed, and useful features are extracted with an oblique projection. Experiments on two-class classification problems indicate that SKSP achieves high accuracy on many of them.
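The KSP decision rule described above can be sketched in a few lines of NumPy. This is a minimal illustrative implementation, not the authors' code: it assumes an RBF kernel (the paper does not fix a kernel choice), and the class name `KSPClassifier` and all helper names are hypothetical. The squared norm of the projection of a mapped sample onto a class's kernel sample space reduces to k_x^T K^+ k_x, where K is the class Gram matrix and k_x the kernel vector between the class samples and the input.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gram matrix of the RBF kernel k(a, b) = exp(-gamma * ||a - b||^2)."""
    sq = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * sq)

class KSPClassifier:
    """Kernel sample space projection (KSP) sketch: assign x to the class
    whose kernel sample space gives the largest projection norm of phi(x)."""

    def __init__(self, gamma=1.0):
        self.gamma = gamma

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.samples_ = {c: X[y == c] for c in self.classes_}
        # Pseudo-inverse of each class Gram matrix K_c.  The squared norm of
        # the projection of phi(x) onto span{phi(x_i) : x_i in class c} is
        # k_x^T K_c^+ k_x, with k_x[i] = k(x_i, x).
        self.Kinv_ = {c: np.linalg.pinv(rbf_kernel(S, S, self.gamma))
                      for c, S in self.samples_.items()}
        return self

    def _proj_norm2(self, x, c):
        kx = rbf_kernel(self.samples_[c], x[None, :], self.gamma)  # (n_c, 1)
        return float(kx.T @ self.Kinv_[c] @ kx)

    def predict(self, X):
        return np.array([max(self.classes_,
                             key=lambda c: self._proj_norm2(x, c))
                         for x in X])
```

On two well-separated Gaussian blobs this rule recovers the class labels, since a sample far from a class's training points has a kernel vector near zero and hence a small projection norm onto that class's kernel sample space. SKSP's oblique projection, which additionally suppresses the directions spanned by the other classes, is not reproduced here.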