![Boosting Ridge Regression](https://cdn.podcastcms.de/images/shows/315/2444607/s/622906643/boosting-ridge-regression.png)
Boosting Ridge Regression
Description
20 years ago
Ridge regression is a well-established method for shrinking regression
parameters towards zero, thereby securing the existence of estimates.
The present paper investigates several approaches to combining ridge
regression with boosting techniques. In the direct approach, the ridge
estimator is used to iteratively fit the current residuals, yielding an
alternative to the usual ridge estimator. In partial boosting, only part
of the regression parameters are re-estimated within each step of the
iterative procedure. This technique makes it possible to distinguish
between variables that are always included in the analysis and variables
that are chosen only if relevant. The resulting procedure selects
variables in a similar way to the Lasso, yielding a reduced set of
influential variables. The suggested procedures are investigated within
the classical framework of continuous response variables as well as in
the case of generalized linear models. In a simulation study, boosting
procedures with different stopping criteria are investigated, and their
performance in terms of prediction and the identification of relevant
variables is compared to several competitors, such as the Lasso and the
more recently proposed elastic net.
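The two approaches described above can be sketched in a few lines of NumPy. This is a minimal illustration under squared-error loss, not the paper's exact algorithm: the function names, the damping factor `nu`, and the rule of picking the single candidate variable that most reduces the residual sum of squares are assumptions made for the example.

```python
import numpy as np

def boosted_ridge(X, y, lam=1.0, n_steps=50, nu=0.1):
    """Direct approach (sketch): refit the ridge estimator to the
    current residuals in each step and accumulate damped updates."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_steps):
        r = y - X @ beta  # current residuals
        # ridge fit to the residuals: (X'X + lam*I)^{-1} X'r
        b = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ r)
        beta += nu * b
    return beta

def partial_boosted_ridge(X, y, lam=1.0, n_steps=50, nu=0.1, mandatory=()):
    """Partial boosting (sketch): in each step only one candidate
    variable, together with the always-included 'mandatory' variables,
    is re-estimated; variables never selected keep an exact zero
    coefficient, giving Lasso-like variable selection."""
    n, p = X.shape
    beta = np.zeros(p)
    candidates = [j for j in range(p) if j not in mandatory]
    for _ in range(n_steps):
        r = y - X @ beta
        best = None  # (error, columns, ridge fit) of the best candidate
        for j in candidates:
            cols = sorted(set(mandatory) | {j})
            Xs = X[:, cols]
            b = np.linalg.solve(Xs.T @ Xs + lam * np.eye(len(cols)),
                                Xs.T @ r)
            err = np.sum((r - Xs @ b) ** 2)
            if best is None or err < best[0]:
                best = (err, cols, b)
        _, cols, b = best
        beta[cols] += nu * b  # only the selected variables move
    return beta
```

Because `partial_boosted_ridge` touches at most one new candidate per step, stopping the iteration early leaves the remaining coefficients at exactly zero, which is the mechanism behind the Lasso-like selection mentioned above.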