Multicollinearity occurs when two or more independent variables are highly correlated with each other.
It results from a violation of the multiple regression assumption that there is no exact linear relationship between two or more independent variables. Multicollinearity is common in financial data.
Multicollinearity does not affect the consistency of the regression coefficient estimates, but it makes them imprecise and unreliable. It inflates the standard errors of the regression coefficients, which makes it nearly impossible to determine how each independent variable individually influences the dependent variable and increases the possibility of Type II errors.
A high value of \(R^2\) and a significant F-statistic, combined with insignificant t-statistics on the individual slope coefficients, signal multicollinearity. The insignificant t-statistics imply that the standard errors are overestimated. In addition, a high pairwise correlation between independent variables indicates multicollinearity. Notably, a low pairwise correlation does not imply the absence of multicollinearity, since a linear relationship can involve three or more variables.
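The symptoms above can be quantified with the variance inflation factor (VIF), computed as \(1/(1-R_j^2)\), where \(R_j^2\) comes from regressing one independent variable on the others. A minimal numpy sketch on simulated (illustrative, not real financial) data, where `x2` is deliberately built as `x1` plus small noise:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two highly correlated regressors: x2 is x1 plus small noise.
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)

def vif(X, j):
    """Variance inflation factor of column j: 1 / (1 - R^2) from
    regressing X[:, j] on the remaining columns (plus an intercept)."""
    target = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(X)), others])
    beta, *_ = np.linalg.lstsq(A, target, rcond=None)
    resid = target - A @ beta
    r2 = 1.0 - resid.var() / target.var()
    return 1.0 / (1.0 - r2)

X = np.column_stack([x1, x2])
print([round(vif(X, j), 1) for j in range(X.shape[1])])
```

A common rule of thumb treats a VIF above 10 as evidence of serious multicollinearity; in this construction both VIFs come out far above that threshold.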
There are a few methods of correcting multicollinearity:

- Excluding one or more of the highly correlated independent variables.
- Replacing a correlated independent variable with a different proxy.
- Increasing the sample size.
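The effect of the first remedy, dropping one of the correlated variables, can be sketched with simulated data (illustrative only; the variable names and data-generating process are assumptions for the demo). The slope standard error on `x1` shrinks sharply once the nearly collinear `x2` is excluded:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)   # nearly collinear with x1
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

def ols_se(X, y):
    """OLS slope standard errors for design matrix X (intercept added)."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    dof = len(y) - A.shape[1]
    sigma2 = resid @ resid / dof                 # residual variance
    cov = sigma2 * np.linalg.inv(A.T @ A)        # coefficient covariance
    return np.sqrt(np.diag(cov))[1:]             # drop the intercept's SE

se_full = ols_se(np.column_stack([x1, x2]), y)   # both regressors included
se_drop = ols_se(x1[:, None], y)                 # x2 excluded
print(se_full, se_drop)
```

With a VIF near 100 between the two regressors, the standard error on `x1` in the full model is roughly an order of magnitude larger than after dropping `x2`, which is exactly the inflation that drives Type II errors.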
Question
The regression problem that will most likely increase the chances of making Type II errors is:
- A. multicollinearity.
- B. conditional heteroskedasticity.
- C. positive serial correlation.
Solution
The correct answer is A.
Multicollinearity artificially inflates the standard errors of the slope coefficients. This increases the likelihood of incorrectly concluding that a variable is not statistically significant (a Type II error).
B is incorrect. Conditional heteroskedasticity causes the standard errors to be underestimated, while the coefficient estimates remain unaffected. This inflates the t-statistics, leading to frequent rejection of the null hypothesis of no statistical significance (a Type I error).
C is incorrect. Positive serial correlation causes the ordinary least squares standard errors for the regression coefficients to underestimate the true standard errors. This inflates the estimated t-statistics, making the coefficients appear more significant than they really are, which increases Type I errors.