Page 28 - FINAL CFA II SLIDES JUNE 2019 DAY 3
READING 8: MULTIPLE REGRESSION AND ISSUES IN REGRESSION ANALYSIS
MODULE 8.8: MULTICOLLINEARITY
LOS 8.l: Describe multicollinearity and explain its causes and effects in regression analysis.
Multicollinearity: the condition in which two or more of the independent variables are highly correlated with each other.
Effect of Multicollinearity on Regression Analysis
Does not affect the consistency of the slope coefficient estimates, but the estimates themselves become unreliable.
Artificially inflates the standard errors of the slope coefficients.
Greater probability of incorrectly concluding that a variable is not statistically significant (i.e., a Type II Error).
Multicollinearity is likely present in most economic models. The issue is whether it has a significant effect on the regression results.
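The standard-error inflation described above can be demonstrated numerically. The sketch below (simulated data and variable names `x1`, `x2` are illustrative assumptions, not from the curriculum) fits OLS by hand with numpy: first regressing y on x1 alone, then adding an almost-identical variable x2, and comparing the standard error of the x1 coefficient in each case.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Hypothetical setup: y truly depends only on x1; x2 is nearly a copy of x1.
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)   # almost perfectly collinear with x1
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

def ols_se(X, y):
    """Return OLS coefficient estimates and their standard errors."""
    X = np.column_stack([np.ones(len(y)), X])   # add intercept column
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    dof = len(y) - X.shape[1]
    sigma2 = resid @ resid / dof                # residual variance
    se = np.sqrt(np.diag(sigma2 * XtX_inv))
    return beta, se

# Fit y on x1 alone, then on both x1 and the collinear x2.
_, se_alone = ols_se(x1.reshape(-1, 1), y)
_, se_both = ols_se(np.column_stack([x1, x2]), y)

print("SE of b1, x1 alone:          ", se_alone[1])
print("SE of b1, with collinear x2: ", se_both[1])
```

The standard error of the x1 coefficient blows up once the collinear x2 enters, so its t-statistic shrinks and the coefficient can look insignificant even though x1 genuinely drives y, which is exactly the Type II error risk noted above.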
Detecting Multicollinearity
1. Check whether the t-tests on the individual slope coefficients say fail to reject H0 while the F-test says reject H0 and R2 is high.
(This tells us the independent variables may share a common source of variation that explains the dependent variable, but the high degree
of correlation also “washes out” their individual effects, hence the contradictory t- and F-test results.)
2. Check whether | r | between any two independent variables is > 0.7. (This works only if there are exactly TWO independent variables. With more than two, the individual
variables may not be highly correlated pairwise, yet linear combinations of them might be, leading to multicollinearity.)
High correlation among the independent variables suggests the possibility of multicollinearity, but low correlation
among the independent variables does not guarantee that multicollinearity is absent.
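The pairwise | r | > 0.7 screen in step 2 can be sketched as follows. The simulated data and the 0.7 cutoff follow the rule of thumb above; the variable construction (x2 built from x1) is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Hypothetical independent variables: x2 is built from x1, so the pair
# is highly correlated; x3 is unrelated noise.
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.2 * rng.normal(size=n)
x3 = rng.normal(size=n)

X = np.column_stack([x1, x2, x3])
corr = np.corrcoef(X, rowvar=False)   # pairwise sample correlation matrix

# Flag every pair of independent variables with |r| > 0.7.
k = X.shape[1]
flagged = [(i, j) for i in range(k) for j in range(i + 1, k)
           if abs(corr[i, j]) > 0.7]
print("Pairs flagged for possible multicollinearity:", flagged)
```

Only the (x1, x2) pair should be flagged here. Note the caveat from step 2: with three or more variables, a clean correlation matrix does not rule out multicollinearity, because a linear combination of several variables can still be nearly collinear with another.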