Shrinkage (statistics)
In statistics, shrinkage has two meanings:
- In relation to the general observation that, in regression analysis, a fitted relationship appears to perform less well on a new data set than on the data set used for fitting.[1] In particular, the value of the coefficient of determination 'shrinks'. This idea is complementary to overfitting and, separately, to the standard adjustment made in the coefficient of determination to compensate for the effects of further sampling, such as the possibility that additional explanatory terms improve the model merely by chance: that is, the adjustment formula itself provides "shrinkage". But the adjustment formula yields an artificial shrinkage, in contrast to the apparent shrinkage observed on new data in the first definition.
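Both effects in the point above can be illustrated numerically. The sketch below (a minimal, hypothetical simulation; the sample sizes, noise levels, and variable names are illustrative choices, not from the source) fits an ordinary least squares model that includes several pure-noise predictors, then compares the R² on the fitting sample, the R² on fresh data from the same process, and the standard adjusted R², which applies an analytic shrinkage to the fitted value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data-generating process: only the first of p predictors
# actually influences the response; the rest are pure noise.
n, p = 50, 10

def make_data(n):
    X = rng.normal(size=(n, p))
    y = 2.0 * X[:, 0] + rng.normal(size=n)
    return X, y

X_train, y_train = make_data(n)
X_test, y_test = make_data(n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X_train])
beta, *_ = np.linalg.lstsq(A, y_train, rcond=None)

def r2(X, y, beta):
    # Coefficient of determination: 1 - SS_res / SS_tot.
    pred = np.column_stack([np.ones(len(y)), X]) @ beta
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

r2_train = r2(X_train, y_train, beta)   # inflated by the noise predictors
r2_test = r2(X_test, y_test, beta)     # "shrinks" on new data

# Adjusted R^2: the adjustment formula's built-in, artificial shrinkage.
r2_adj = 1.0 - (1.0 - r2_train) * (n - 1) / (n - p - 1)

print(r2_train, r2_test, r2_adj)
```

On typical runs the training R² exceeds both the adjusted R² and the R² recomputed on the new sample, matching the two senses of shrinkage described above.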
- To describe general types of estimators, or the effects of some types of estimation, whereby a naive or raw estimate is improved by combining it with other information (see shrinkage estimator). The term relates to the notion that the improved estimate is closer to the value supplied by the 'other information' than is the raw estimate. In this sense, shrinkage is used to regularize ill-posed inference problems.
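A minimal sketch of a shrinkage estimator in the sense above (the setup is hypothetical: the group structure, noise levels, and fixed shrinkage weight are illustrative assumptions, and practical methods such as empirical Bayes estimate the weight from the data rather than fixing it by hand). Each group's raw estimate, its sample mean, is pulled partway toward the grand mean across groups, which plays the role of the 'other information'.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 20 groups whose true means scatter around a common
# value; each group is observed through a small, noisy sample of size 5.
true_means = rng.normal(loc=10.0, scale=1.0, size=20)
samples = rng.normal(loc=true_means[:, None], scale=3.0, size=(20, 5))

raw = samples.mean(axis=1)   # naive estimate: per-group sample mean
grand = raw.mean()           # pooled 'other information'

# Fixed shrinkage weight, chosen by hand for illustration; empirical-Bayes
# and James-Stein-type methods choose this weight from the data.
w = 0.5
shrunk = w * raw + (1.0 - w) * grand

mse_raw = np.mean((raw - true_means) ** 2)
mse_shrunk = np.mean((shrunk - true_means) ** 2)
print(mse_raw, mse_shrunk)
```

Because the per-group samples are small and noisy relative to the spread of the true means, the shrunken estimates usually have smaller total squared error than the raw sample means, which is the sense in which the raw estimate is "improved".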
A common idea underlying both of these meanings is the reduction in the effects of sampling variation.
References
- ↑ Everitt, B. S. (2002). The Cambridge Dictionary of Statistics (2nd ed.). Cambridge University Press. ISBN 0-521-81099-X.
This article is issued from Wikipedia - version of Thursday, June 27, 2013. The text is available under the Creative Commons Attribution/Share-Alike license, but additional terms may apply for the media files.