Variance inflation factor

In statistics, the variance inflation factor (VIF) quantifies the severity of multicollinearity in an ordinary least squares regression analysis. It provides an index that measures how much the variance (the square of the estimate's standard deviation) of an estimated regression coefficient is increased because of collinearity.

Definition

Consider the following linear model with k independent variables:

Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_k X_k + \varepsilon.

The standard error of the estimate of \beta_j is the square root of the (j+1, j+1) element of s^2 (X^{\mathsf T} X)^{-1}, where s is the root mean squared error (RMSE) (note that RMSE^2 is an unbiased estimator of the true variance of the error term, \sigma^2); X is the regression design matrix, a matrix such that X_{i, j+1} is the value of the jth independent variable for the ith case or observation, and such that X_{i, 1} equals 1 for all i. It turns out that the square of this standard error, the estimated variance of the estimate of \beta_j, can be equivalently expressed as


{\rm \widehat{var}}(\hat{\beta}_j) = \frac{s^2}{(n-1)\widehat{\rm var}(X_j)}\cdot \frac{1}{1-R_j^2},

where R_j^2 is the multiple R^2 for the regression of X_j on the other covariates (a regression that does not involve the response variable Y). This identity separates the influences of several distinct factors on the variance of the coefficient estimate: the first factor shows that the variance grows with the error variance s^2 and shrinks with the sample size n and with the variability of X_j.

The remaining term, 1/(1 - R_j^2), is the VIF. It reflects all the other factors that influence the uncertainty in the coefficient estimates. The VIF equals 1 when the vector X_j is orthogonal to each column of the design matrix for the regression of X_j on the other covariates, and is greater than 1 when it is not. Finally, note that the VIF is invariant to the scaling of the variables (that is, we could scale each variable X_j by a constant c_j without changing the VIF).
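As an illustrative check of this identity, the following sketch (using NumPy with simulated data; the variable names and the simulation are purely illustrative and not part of the original presentation) computes the estimated variance of \hat\beta_1 both from the diagonal of s^2 (X^{\mathsf T} X)^{-1} and from the decomposition above, and the two agree:

import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated data: two correlated covariates and a response.
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)           # correlated with x1
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

# Design matrix with a leading column of ones (the intercept).
X = np.column_stack([np.ones(n), x1, x2])
k = X.shape[1] - 1                                  # number of covariates

# Ordinary least squares fit and the unbiased estimate s^2 of the error variance.
beta_hat, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
s2 = resid @ resid / (n - k - 1)

# Estimated variance of beta_1 directly from the diagonal of s^2 (X'X)^{-1}.
var_direct = s2 * np.linalg.inv(X.T @ X)[1, 1]

# The same quantity from the decomposition: the first factor times the VIF,
# where R_1^2 comes from regressing x1 on a constant and the other covariate.
Z = np.column_stack([np.ones(n), x2])
gamma, _, _, _ = np.linalg.lstsq(Z, x1, rcond=None)
r2_1 = 1.0 - np.sum((x1 - Z @ gamma) ** 2) / np.sum((x1 - x1.mean()) ** 2)
var_decomp = s2 / ((n - 1) * x1.var(ddof=1)) * 1.0 / (1.0 - r2_1)

print(var_direct, var_decomp)                       # identical up to rounding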

Calculation and analysis

We can calculate k different VIFs (one for each X_i) in three steps:

Step one

First, we run an ordinary least squares regression that has X_i as a function of all the other explanatory variables in the first equation.
If i = 1, for example, the equation would be

X_1=\alpha_2 X_2 + \alpha_3 X_3 + \cdots + \alpha_k X_k + c_0 +e

where c_0 is a constant and e is the error term.
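As a concrete sketch of this step (using NumPy with simulated data; x1, x2, x3 and the simulation are illustrative only), the auxiliary regression for i = 1 can be fit by ordinary least squares:

import numpy as np

rng = np.random.default_rng(1)
n = 100

# Illustrative explanatory variables; x1 is correlated with x2 and x3 by construction.
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
x1 = 0.7 * x2 + 0.5 * x3 + rng.normal(size=n)

# Step one: regress X_1 on a constant and the remaining explanatory variables.
Z = np.column_stack([np.ones(n), x2, x3])           # columns: c_0, X_2, X_3
coef, _, _, _ = np.linalg.lstsq(Z, x1, rcond=None)
e = x1 - Z @ coef                                   # residuals of the auxiliary regression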

Step two

Then, calculate the VIF for \hat\beta_i with the following formula:

\mathrm{VIF}= \frac{1}{1-R^2_i}

where R^2_i is the coefficient of determination of the regression equation in step one, with X_i on the left-hand side and all other predictor variables (all the other X variables) on the right-hand side.
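Continuing the sketch from step one (this fragment reuses x1, Z, and the residuals e defined there, so it is not self-contained on its own), R^2_1 and the VIF follow directly:

# Step two: coefficient of determination of the auxiliary regression, then the VIF.
r2_1 = 1.0 - np.sum(e ** 2) / np.sum((x1 - x1.mean()) ** 2)
vif_1 = 1.0 / (1.0 - r2_1)
print(r2_1, vif_1)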

Step three

Analyze the magnitude of multicollinearity by considering the size of the \operatorname{VIF}(\hat \beta_i). A rule of thumb is that if \operatorname{VIF}(\hat \beta_i) > 10 then multicollinearity is high.[1]

Some software instead calculates the tolerance, which is simply the reciprocal of the VIF. The choice of which to use is a matter of personal preference.
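Putting the three steps together, a minimal self-contained sketch (NumPy; the function name vifs, the simulated data, and the column layout are assumptions for illustration, not a standard library routine) computes the VIF and the tolerance for every covariate and applies the rule of thumb:

import numpy as np

def vifs(X):
    # X holds one column per covariate (no intercept column).
    # For each column, fit the step-one auxiliary regression on the others,
    # take its R^2 (step two), and return 1 / (1 - R^2).
    n, k = X.shape
    out = np.empty(k)
    for i in range(k):
        others = np.delete(X, i, axis=1)
        Z = np.column_stack([np.ones(n), others])
        coef, _, _, _ = np.linalg.lstsq(Z, X[:, i], rcond=None)
        resid = X[:, i] - Z @ coef
        r2 = 1.0 - resid @ resid / np.sum((X[:, i] - X[:, i].mean()) ** 2)
        out[i] = 1.0 / (1.0 - r2)
    return out

# Illustrative usage with simulated, deliberately collinear covariates.
rng = np.random.default_rng(2)
n = 100
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
x1 = 0.7 * x2 + 0.5 * x3 + rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

v = vifs(X)
print(v)           # one VIF per covariate
print(1.0 / v)     # tolerance, the reciprocal of the VIF
print(v > 10)      # step three: rule-of-thumb flag for high multicollinearity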

Interpretation

The square root of the variance inflation factor indicates how much larger the standard error of the estimated coefficient is, compared with what it would be if that variable were uncorrelated with the other predictor variables in the model.

Example
If the variance inflation factor of a predictor variable were 5.27 (√5.27 ≈ 2.3), this means that the standard error for the coefficient of that predictor variable is 2.3 times as large as it would be if that predictor variable were uncorrelated with the other predictor variables.
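The arithmetic of this example, as a trivial sketch (the value 5.27 is taken from the example above):

import math

vif = 5.27
se_ratio = math.sqrt(vif)   # about 2.30
print(se_ratio)             # the standard error is roughly 2.3 times what it
                            # would be without correlation among the predictors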

References

  1. Kutner, M. H.; Nachtsheim, C. J.; Neter, J. (2004). Applied Linear Regression Models (4th ed.). McGraw-Hill Irwin.
