**Variance inflation factor**
In statistics, the **variance inflation factor (VIF)** is a method of detecting the severity of multicollinearity. More precisely, the VIF is an index that measures how much the variance of a coefficient (the square of its standard deviation) is increased because of collinearity. Consider the following regression equation with $k$ independent variables:

$Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_k X_k + \varepsilon$

The VIF can be calculated in three steps:

**Step one**

One can calculate $k$ different VIFs, one for each $X_i$, by first running an ordinary least squares regression that has $X_i$ as a function of all the other explanatory variables in the first equation. If $i = 1$, for example, the equation would be

$X_1 = \alpha_2 X_2 + \alpha_3 X_3 + \cdots + \alpha_k X_k + c_0 + e$

where $c_0$ is a constant and $e$ is the error term.

**Step two**

Then one can calculate the VIF for $\hat{\beta}_i$ with the following formula:

$\mathrm{VIF}(\hat{\beta}_i) = \frac{1}{1 - R_i^2}$

where $R_i^2$ is the coefficient of determination of the regression equation in step one.

**Step three**

Analyse the magnitude of multicollinearity by considering the size of $\mathrm{VIF}(\hat{\beta}_i)$. A common rule of thumb is that if $\mathrm{VIF}(\hat{\beta}_i) > 5$, then multicollinearity is high. A cutoff value of 10 has also been proposed (see the Kutner, Nachtsheim, Neter book referenced below). Some software calculates the tolerance, which is simply the reciprocal of the VIF. The choice of which measure to use is mostly a matter of personal preference.
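The three steps above can be sketched in a few lines of Python using only NumPy. The function and the illustrative data below are assumptions for demonstration, not part of any particular library's API:

```python
import numpy as np

def vif(X):
    """Compute the VIF for each column of the design matrix X (n x k),
    following the three steps above: regress each X_i on the remaining
    predictors plus an intercept, take R^2, and return 1 / (1 - R^2)."""
    n, k = X.shape
    vifs = []
    for i in range(k):
        y = X[:, i]
        others = np.delete(X, i, axis=1)
        # design matrix with an intercept column (the constant c_0)
        A = np.column_stack([np.ones(n), others])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        r2 = 1.0 - resid.var() / y.var()   # coefficient of determination
        vifs.append(1.0 / (1.0 - r2))      # step two formula
    return np.array(vifs)

# Illustrative data: x2 is nearly collinear with x1, x3 is independent.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.1 * rng.normal(size=200)
x3 = rng.normal(size=200)
X = np.column_stack([x1, x2, x3])
print(vif(X))
```

With data like this, the VIFs for the two collinear columns come out far above the rule-of-thumb cutoff of 5, while the independent column's VIF stays close to 1.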

**Interpretation**

The square root of the variance inflation factor tells you how much larger the standard error is, compared with what it would be if that variable were uncorrelated with the other independent variables in the equation.
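This square-root relationship, and the tolerance mentioned above, can be verified with a quick calculation (the VIF value 5.27 is purely illustrative):

```python
import math

vif_value = 5.27                       # illustrative VIF, not real data

# The standard error of the coefficient is inflated by sqrt(VIF)
# relative to the uncorrelated case.
se_inflation = math.sqrt(vif_value)

# Tolerance, as reported by some software, is the reciprocal of the VIF.
tolerance = 1.0 / vif_value

print(round(se_inflation, 2))  # 2.3
print(round(tolerance, 2))     # 0.19
```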

**Example**

If the variance inflation factor of an independent variable were 5.27 ($\sqrt{5.27} \approx 2.3$), this means that the standard error for the coefficient of that independent variable is 2.3 times as large as it would be if that independent variable were uncorrelated with the other independent variables.

**References**

Longnecker, M.T. & Ott, R.L.: "A First Course in Statistical Methods", page 615. Thomson Brooks/Cole, 2004.

Studenmund, A.H.: "Using Econometrics: A Practical Guide", 5th Edition, pages 258-259. Pearson International Edition, 2006.

Hair JF, Anderson R, Tatham RL, Black WC: "Multivariate Data Analysis". Prentice Hall: Upper Saddle River, N.J. 2006.

Marquardt, D.W. 1970. "Generalized Inverses, Ridge Regression, Biased Linear Estimation, and Nonlinear Estimation". Technometrics 12(3), 591, 605-07.

Allison, P.D.: "Multiple Regression: A Primer", page 142. Pine Forge Press: Thousand Oaks, CA, 1999.

Kutner, Nachtsheim, Neter, "Applied Linear Regression Models", 4th edition, McGraw-Hill Irwin, 2004.

*Wikimedia Foundation. 2010.*