Kernelized Elastic Net Regularization: Generalization Bounds, and Sparse Recovery
Kernelized elastic net regularization (KENReg) is a kernelization of the well-known elastic net regularization (Zou & Hastie, 2005). The kernel in KENReg is not required to be a Mercer kernel, since it learns from a kernelized dictionary in the coefficient space. Feng, Yang, Zhao, Lv, and Suykens (2014) showed that KENReg has several desirable properties, including stability, sparseness, and generalization. In this letter, we continue our study of KENReg by conducting a refined learning theory analysis. This letter makes three main contributions. First, we present a refined error analysis of the generalization performance of KENReg. The main difficulty in analyzing the generalization error of KENReg lies in characterizing the population version of its empirical target function. We overcome this by introducing a weighted Banach space associated with the elastic net regularization. We are then able to conduct an elaborated learning theory analysis and obtain fast convergence rates under proper complexity and regularity assumptions. Second, we study the sparse recovery problem in KENReg with fixed design and show that the kernelization may improve the sparse recovery ability compared with the classical elastic net regularization. Finally, we discuss the interplay among different properties of KENReg, including sparseness, stability, and generalization. We show that the stability of KENReg leads to generalization and that its sparseness confidence can be derived from generalization. Moreover, KENReg can be simultaneously stable and sparse, which makes it attractive both theoretically and practically.
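The estimator described above can be sketched in a few lines. KENReg fits f(x) = Σ_j α_j k(x, x_j) by applying the elastic net penalty directly to the coefficient vector α over the kernelized dictionary, so the kernel need not be positive semidefinite. The following is a minimal illustrative sketch, not the authors' implementation: the helper names (`soft`, `kenreg`), the coordinate-descent solver, the sigmoid kernel, and all parameter values are assumptions chosen for illustration.

```python
import numpy as np

def soft(z, t):
    """Soft-thresholding operator (illustrative helper)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def kenreg(K, y, lam1=0.05, lam2=0.1, iters=500):
    """Coordinate descent sketch for
       min_a (1/(2n))||y - K a||^2 + lam1*||a||_1 + (lam2/2)*||a||_2^2,
    where K is the kernelized dictionary; K need not be PSD."""
    n, m = K.shape
    a = np.zeros(m)
    col_sq = (K ** 2).sum(axis=0) / n   # per-column curvature terms
    r = y - K @ a                        # current residual
    for _ in range(iters):
        for j in range(m):
            r += K[:, j] * a[j]          # remove coordinate j's contribution
            z = K[:, j] @ r / n
            a[j] = soft(z, lam1) / (col_sq[j] + lam2)
            r -= K[:, j] * a[j]          # restore with updated coefficient
    return a

# Synthetic data and a non-Mercer sigmoid kernel as the dictionary
# generator (an illustrative choice, not prescribed by the letter).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (80, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(80)
K = np.tanh(2.0 * X @ X.T)

a = kenreg(K, y)
print("nonzero coefficients:", int((a != 0).sum()), "of", a.size)
```

The L1 term drives many coefficients to exactly zero (sparseness), while the L2 term keeps the coordinate updates well conditioned even when columns of K are highly correlated, which is the stability side of the trade-off the letter analyzes.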