
Seagull: lasso, group lasso and sparse-group lasso regularization for linear regression models via proximal gradient descent

Abstract

Background: Statistical analyses of biological problems in the life sciences often lead to high-dimensional linear models. To solve the corresponding system of equations, penalization approaches are often the methods of choice. They are especially useful in the case of multicollinearity, which arises when the number of explanatory variables exceeds the number of observations or for some biological reason. The model's goodness of fit is then penalized by a suitable function of the coefficients. Prominent examples are the lasso, group lasso and sparse-group lasso. Here, we offer a fast and numerically cheap implementation of these operators via proximal gradient descent. The grid search for the penalty parameter is realized with warm starts. The step size between consecutive iterations is determined via backtracking line search. Finally, seagull, the R package presented here, produces complete regularization paths.

Results: Publicly available high-dimensional methylation data are used to compare seagull to the established R package SGL. The results of both packages enabled a precise prediction of biological age from DNA methylation status. Even though the results of seagull and SGL were very similar (R² > 0.99), seagull computed the solution in a fraction of the time needed by SGL. Additionally, seagull enables the incorporation of weights for each penalized feature.

Conclusions: The following operators for linear regression models are available in seagull: lasso, group lasso, sparse-group lasso and Integrative LASSO with Penalty Factors (IPF-lasso). Thus, seagull is a convenient envelope of lasso variants.
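The algorithmic ingredients named above (a proximal step per iteration, a step size chosen by backtracking line search, and warm starts across the penalty grid) fit together as follows. This is a minimal sketch in R for the plain lasso, written for illustration only; it is not the seagull implementation, and all function names here (soft_threshold, lasso_pgd, lasso_path) are hypothetical.

soft_threshold <- function(z, t) sign(z) * pmax(abs(z) - t, 0)

lasso_pgd <- function(X, y, lambda, beta, max_iter = 1000L,
                      step = 1, shrink = 0.5, tol = 1e-7) {
  f <- function(b) 0.5 * sum((y - X %*% b)^2)    # smooth part of the objective
  for (it in seq_len(max_iter)) {
    grad <- -drop(crossprod(X, y - X %*% beta))  # gradient of the smooth part
    repeat {                                     # backtracking line search
      beta_new <- soft_threshold(beta - step * grad, step * lambda)
      d <- beta_new - beta
      if (f(beta_new) <= f(beta) + sum(grad * d) + sum(d^2) / (2 * step)) break
      step <- step * shrink                      # shrink step until the
    }                                            # majorization condition holds
    if (max(abs(beta_new - beta)) < tol) return(beta_new)
    beta <- beta_new
  }
  beta
}

## Warm starts: solve along a decreasing lambda grid, initializing each
## problem at the previous solution; the columns form the regularization path.
lasso_path <- function(X, y, n_lambda = 20L) {
  lambda_max <- max(abs(crossprod(X, y)))        # smallest lambda with beta = 0
  lambdas <- lambda_max * 10^seq(0, -3, length.out = n_lambda)
  beta <- numeric(ncol(X))
  path <- matrix(0, ncol(X), n_lambda)
  for (j in seq_along(lambdas)) {
    beta <- lasso_pgd(X, y, lambdas[j], beta)    # warm start from previous beta
    path[, j] <- beta
  }
  path
}

For the group lasso and sparse-group lasso, only the proximal operator changes: soft-thresholding is replaced or complemented by groupwise shrinkage of the coefficient blocks, while the backtracking and warm-start logic stays the same.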
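To give a feel for the package interface summarized in the conclusions, a call might look as follows. This is a hedged sketch: the argument names y, Z, groups and alpha, and the convention that alpha = 1 yields the lasso while alpha = 0 yields the group lasso, are assumptions based on the package's documented mixed-model interface and should be verified against ?seagull.

library(seagull)

set.seed(1)
n <- 50; p <- 60
Z <- matrix(rnorm(n * p), n, p)              # penalized feature matrix
y <- drop(Z[, 1:5] %*% rep(1, 5)) + rnorm(n) # sparse ground truth
grp <- rep(1:12, each = 5)                   # group index per feature (assumed format)

## Argument names below are assumptions; check the package documentation.
fit_l   <- seagull(y = y, Z = Z, alpha = 1)                     # lasso
fit_gl  <- seagull(y = y, Z = Z, groups = grp, alpha = 0)       # group lasso
fit_sgl <- seagull(y = y, Z = Z, groups = grp, alpha = 0.5)     # sparse-group lasso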

Related Results

Joint sparse optimization: lower-order regularization method and application in cell fate conversion
Abstract Multiple measurement signals are commonly collected in practical applications, and joint sparse optimization adopts the synchronous effect within multiple m...
seagull: lasso, group lasso and sparse-group lasso regularisation for linear regression models via proximal gradient descent
Summary Statistical analyses of biological problems in life sciences often lead to high-dimensional linear models. To solve the corresponding system of equations, penalisation appro...
Analytical Solutions to Minimum-Norm Problems
For G ∈ ℝ^{m×n} and g ∈ ℝ^m, the minimization min ‖Gψ − g‖₂, with ψ ∈ ℝⁿ, is known as the Tykhonov regularization. We transport the Tykhonov regularization to an infinite-dimensional setting, t...
A multiple-parameter regularization approach for filtering monthly GRACE/GRACE-FO gravity models
The Gravity Recovery and Climate Experiment (GRACE) and its subsequent GRACE Follow-On (GRACE-FO) missions have been instrumental in monitoring Earth’s mass changes throu...
Kernelized Elastic Net Regularization: Generalization Bounds, and Sparse Recovery
Kernelized elastic net regularization (KENReg) is a kernelization of the well-known elastic net regularization (Zou & Hastie, 2005). The kernel in KENReg is not required to be...
Regularization of classical optimality conditions in optimization problems for linear Volterra-type systems with functional constraints
We consider the regularization of classical optimality conditions (COCs) — the Lagrange principle (LP) and the Pontryagin maximum principle (PMP) — in a convex optimal control prob...
