Penalized Partial Least Squares Based on B-Splines Transformations
Description
We propose a novel method to model nonlinear regression problems by
adapting the principle of penalization to Partial Least Squares
(PLS). Starting with a generalized additive model, we expand the
additive component of each variable in terms of a generous number
of B-spline basis functions. In order to prevent overfitting and
to obtain smooth functions, we estimate the regression model by
applying a penalized version of PLS. Although our motivation for
penalized PLS stems from its use for B-spline transformed data,
the proposed approach is very general and can be applied to other
penalty terms or to other dimension reduction techniques. It turns
out that penalized PLS can be computed virtually as fast as PLS. We
prove a close connection between penalized PLS and the solutions
of preconditioned linear systems. In the case of high-dimensional
data, the new method is shown to be an attractive competitor to
other techniques for estimating generalized additive models. If the
number of predictor variables is high compared to the number of
examples, traditional techniques often suffer from overfitting. We
illustrate that penalized PLS performs well in these situations.
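To make the construction concrete, here is a minimal numerical sketch, not the authors' implementation: each predictor is expanded in a cubic B-spline basis, a second-order difference penalty is assembled blockwise over the basis coefficients, and the PLS weight vectors are damped by (I + λP)^{-1} before the usual NIPALS deflation. The helper names (bspline_basis, second_diff_penalty, penalized_pls), the toy data, and the choices λ = 10 and four components are illustrative assumptions; the dense matrix inverse is used only for readability, whereas the paper's point is that penalized PLS can be computed essentially as fast as ordinary PLS.

```python
# Sketch of penalized PLS on B-spline transformed predictors (assumed helpers).
import numpy as np
from scipy.interpolate import BSpline
from scipy.linalg import block_diag

def bspline_basis(x, n_basis=10, degree=3):
    """Evaluate n_basis cubic B-spline basis functions at the points x."""
    x = np.asarray(x, dtype=float)
    inner = np.linspace(x.min(), x.max(), n_basis - degree + 1)
    knots = np.r_[[inner[0]] * degree, inner, [inner[-1]] * degree]
    coef = np.eye(n_basis)
    return np.column_stack([BSpline(knots, coef[j], degree)(x)
                            for j in range(n_basis)])

def second_diff_penalty(n_basis):
    """Second-order difference penalty D'D, penalizing rough coefficient vectors."""
    D = np.diff(np.eye(n_basis), n=2, axis=0)
    return D.T @ D

def penalized_pls(Z, y, P, lam=1.0, n_comp=5):
    """PLS1 whose weights are proportional to (I + lam*P)^{-1} Z' y."""
    Zc = Z - Z.mean(axis=0)
    yc = y - y.mean()
    p = Zc.shape[1]
    M = np.linalg.solve(np.eye(p) + lam * P, np.eye(p))  # (I + lam*P)^{-1}
    Zk, W, T = Zc.copy(), [], []
    for _ in range(n_comp):
        w = M @ (Zk.T @ yc)            # penalized covariance direction
        t = Zk @ w
        t /= np.linalg.norm(t)
        Zk = Zk - np.outer(t, t @ Zk)  # deflate the basis matrix
        W.append(w); T.append(t)
    W, T = np.column_stack(W), np.column_stack(T)
    Pload = Zc.T @ T                   # loadings of the centered data
    q = T.T @ yc                       # inner regression on the scores
    beta = W @ np.linalg.solve(Pload.T @ W, q)
    return beta, y.mean() - Z.mean(axis=0) @ beta  # coefficients, intercept

# Toy additive example: nonlinear signal in two of five predictors.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(100, 5))
y = np.sin(2 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(100)

n_basis = 10
Z = np.hstack([bspline_basis(X[:, j], n_basis) for j in range(X.shape[1])])
P = block_diag(*[second_diff_penalty(n_basis) for _ in range(X.shape[1])])
beta, intercept = penalized_pls(Z, y, P, lam=10.0, n_comp=4)
print("training RMSE:", np.sqrt(np.mean((Z @ beta + intercept - y) ** 2)))
```

The blockwise difference penalty encourages each estimated additive component to be smooth, which is the role the penalty term plays in the method described above.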