seagull: lasso, group lasso and sparse-group lasso regularisation for linear regression models via proximal gradient descent
Publisher: Cold Spring Harbor Laboratory
Description:
Summary: Statistical analyses of biological problems in the life sciences often lead to high-dimensional linear models.
To solve the corresponding system of equations, penalisation approaches are often the methods of choice.
They are especially useful in the case of multicollinearity, which arises when the number of explanatory variables exceeds the number of observations or for some biological reason.
The goodness of fit of the model is then penalised by a suitable function of the coefficients.
Prominent examples are the lasso, the group lasso and the sparse-group lasso.
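For orientation, a common formulation of the sparse-group lasso objective is shown below; seagull's exact parameterisation and group weights may differ, so treat this as a standard textbook version rather than the package's definition:

    \[
    \min_{\beta \in \mathbb{R}^p} \; \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2
      \;+\; \alpha\lambda\,\lVert \beta \rVert_1
      \;+\; (1-\alpha)\,\lambda \sum_{g=1}^{G} \sqrt{p_g}\,\lVert \beta^{(g)} \rVert_2
    \]

Here lambda >= 0 controls the overall penalty strength, alpha in [0, 1] mixes the two terms (alpha = 1 yields the lasso, alpha = 0 the group lasso), and beta^(g) collects the p_g coefficients of group g.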
Here, we offer a fast and numerically cheap implementation of these operators via proximal gradient descent.
The grid search for the penalty parameter is realised by warm starts.
The step size between consecutive iterations is determined with backtracking line search.
Finally, the package produces complete regularisation paths.
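To make these algorithmic ingredients concrete, the following is a generic R sketch of proximal gradient descent with backtracking line search for the lasso case. It is not seagull's implementation; the function names (prox_l1, prox_grad_lasso) are hypothetical, and the warm-start idea from the abstract is noted in the closing comment.

    ## Generic sketch of proximal gradient descent with backtracking line
    ## search for the lasso penalty (illustrative, NOT seagull's code).
    prox_l1 <- function(b, t) sign(b) * pmax(abs(b) - t, 0)  # soft-thresholding

    prox_grad_lasso <- function(y, X, lambda, max_iter = 500, tol = 1e-7) {
      n <- nrow(X)
      b <- rep(0, ncol(X))           # starting value (or a warm start)
      s <- 1                         # initial step size
      f <- function(b) sum((y - X %*% b)^2) / (2 * n)  # smooth part
      for (k in seq_len(max_iter)) {
        g <- as.vector(-crossprod(X, y - X %*% b)) / n # gradient of f
        repeat {                     # backtracking: shrink s until the
          b_new <- prox_l1(b - s * g, s * lambda)      # quadratic bound holds
          d <- b_new - b
          if (f(b_new) <= f(b) + sum(g * d) + sum(d^2) / (2 * s)) break
          s <- s / 2
        }
        if (sqrt(sum((b_new - b)^2)) < tol) return(b_new)
        b <- b_new
      }
      b
    }

    ## For a complete regularisation path, solve over a decreasing grid of
    ## lambda values and reuse each solution as the warm start for the next.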
Availability and implementation: seagull is an R package that is freely available on the Comprehensive R Archive Network (CRAN; https://CRAN.R-project.org/package=seagull; vignette included).
The source code is available on https://github.com/jklosa/seagull.
Contact: wittenburg@fbn-dummerstorf.de
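The package exposes a single main function, seagull(). Below is a minimal usage sketch on simulated data fitting all three penalties; the argument names (y, Z, groups, alpha) follow the CRAN reference manual but should be treated as assumptions and checked against ?seagull.

    ## Minimal, illustrative usage sketch (check ?seagull for the exact API).
    library(seagull)

    set.seed(1)
    n <- 50; p <- 200                        # p >> n, as in the abstract
    Z <- matrix(rnorm(n * p), n, p)          # feature (design) matrix
    beta <- c(rnorm(10), rep(0, p - 10))     # sparse true coefficients
    y <- as.vector(Z %*% beta + rnorm(n))
    groups <- rep(seq_len(20), each = 10)    # 20 groups of 10 features

    fit_lasso <- seagull(y = y, Z = Z, alpha = 1)                     # lasso
    fit_gl    <- seagull(y = y, Z = Z, groups = groups, alpha = 0)    # group lasso
    fit_sgl   <- seagull(y = y, Z = Z, groups = groups, alpha = 0.5)  # sparse-group

    ## Each fit stores the complete regularisation path, i.e. coefficient
    ## estimates for a whole grid of penalty parameters.
    str(fit_sgl, max.level = 1)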
Related Results
Seagull: lasso, group lasso and sparse-group lasso regularization for linear regression models via proximal gradient descent
Abstract: Background: Statistical analyses of biological problems in life sciences often lead to high-dimensional linear models. To solve the corresponding system of equations, penal...
Joint sparse optimization: lower-order regularization method and application in cell fate conversion
Abstract: Multiple measurement signals are commonly collected in practical applications, and joint sparse optimization adopts the synchronous effect within multiple m...
A performance comparison of confidence interval construction methods for logistic regression coefficients in high-dimensional data, using two-stage estimation with lasso + MLE and a bootstrap lasso + partial ridge
This research aims to compare confidence interval construction methods for logistic regression coefficients in high-dimensional data, using two-stage estimation with Lasso+MLE and Las...
PROXIMAL HUMERUS FRACTURES, ANATOMY, EPIDEMIOLOGY, MECHANISMS OF ACTION, CLASSIFICATION, CLINICAL PRESENTATION, IMAGING PRESENTATION, DIFFERENTIAL DIAGNOSIS, TREATMENT AND COMPLICATIONS
Introduction: Proximal humerus fractures (PHF) make up 5 to 6% of all fractures presented in adults. Approximately 67 to 85% of proximal humerus fractures are managed non-surgicall...
Regularisation
Ch. 4 considers the regularisation of some basic divergent series. The first is the geometric series, which is shown to be conditionally convergent outside the unit circle of absol...
Sparse Unmixing of Hyperspectral Data with Noise Level Estimation
Recently, sparse unmixing has received particular attention in the analysis of hyperspectral images (HSIs). However, traditional sparse unmixing ignores the different noise levels ...
LASSO Homotopy-Based Sparse Representation Classification for fNIRS-BCI
Brain-computer interface (BCI) systems based on functional near-infrared spectroscopy (fNIRS) have been used as a way of facilitating communication between the brain and peripheral...
Mellin-Barnes Regularisation
Ch. 7 presents the theory behind an alternative method of regularising a divergent series known as Mellin-Barnes (MB) regularisation. As a result, the regularised values for more g...

