Explain Multicollinearity and How It Affects Regression Analysis

Multicollinearity occurs when two or more independent variables in a regression model are highly correlated with each other.

It violates the multiple regression assumption that no exact (or near-exact) linear relationship exists between two or more independent variables. Multicollinearity is common in financial data.

Effects of Multicollinearity

Multicollinearity does not affect the consistency of the regression estimates. However, it renders them imprecise and unreliable, making it nearly impossible to isolate the individual influence of each independent variable on the dependent variable. It also inflates the standard errors of the regression coefficients, increasing the likelihood of Type II errors (failing to reject a false null hypothesis).
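The inflation of standard errors can be demonstrated with a small simulation (not from the original text; all variable names are illustrative). The same model is fit twice, once with uncorrelated predictors and once with highly correlated ones:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

def slope_std_errors(X, y):
    """OLS standard errors of the slope coefficients (intercept excluded)."""
    Xd = np.column_stack([np.ones(len(y)), X])      # add intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    sigma2 = resid @ resid / (len(y) - Xd.shape[1])  # residual variance
    cov = sigma2 * np.linalg.inv(Xd.T @ Xd)          # coefficient covariance
    return np.sqrt(np.diag(cov))[1:]

x1 = rng.standard_normal(n)
x2_uncorr = rng.standard_normal(n)                   # independent of x1
rho = 0.99
x2_corr = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)

se = {}
for label, x2 in [("uncorrelated", x2_uncorr), ("collinear", x2_corr)]:
    y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.standard_normal(n)
    se[label] = slope_std_errors(np.column_stack([x1, x2]), y)

print(se)  # collinear standard errors are several times larger
```

With \(\rho = 0.99\), the slope standard errors in the collinear case come out roughly \(\sqrt{1/(1-\rho^2)} \approx 7\) times larger, which is exactly why insignificant t-statistics appear despite a good overall fit.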

Detecting Multicollinearity

A high value of \(R^2\) and a significant F-statistic, combined with insignificant t-statistics on the individual coefficients, signal multicollinearity. The insignificant t-statistics imply that the standard errors are overestimated. In addition, a high pairwise correlation between independent variables indicates multicollinearity. Notably, a low pairwise correlation between independent variables does not imply the absence of multicollinearity.
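A standard diagnostic mentioned later in this note is the Variance Inflation Factor (VIF). A minimal sketch in plain NumPy (function and variable names are illustrative): each predictor is regressed on the remaining predictors, and \( \text{VIF}_j = 1/(1 - R_j^2) \). A common rule of thumb flags values above 10:

```python
import numpy as np

def variance_inflation_factors(X):
    """VIF for each column of X, via auxiliary regressions."""
    n, k = X.shape
    vifs = []
    for j in range(k):
        target = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, target, rcond=None)
        resid = target - others @ beta
        tss = np.sum((target - target.mean()) ** 2)
        r2 = 1.0 - resid @ resid / tss
        vifs.append(1.0 / (1.0 - r2))
    return np.array(vifs)

rng = np.random.default_rng(1)
x1 = rng.standard_normal(200)
x2 = x1 + 0.1 * rng.standard_normal(200)   # nearly collinear with x1
x3 = rng.standard_normal(200)              # unrelated predictor
vifs = variance_inflation_factors(np.column_stack([x1, x2, x3]))
print(vifs)  # first two VIFs are large; the third is close to 1
```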

Correcting Multicollinearity

There are a few methods of correcting multicollinearity:

  1. Reducing the number of predictor variables in the model by excluding some of them or combining two or more correlated predictors into one. This is often done by conducting feature selection techniques, such as forward selection, backward selection and stepwise regression.
  2. Regularization methods such as ridge regression and lasso regression. These reduce the magnitude of coefficients for predictors that are highly correlated with each other, thus penalizing large coefficient values associated with these predictors and reducing their influence on the model.
  3. Decorrelation methods which involve transforming the data so that predictors become uncorrelated. One popular way of doing this is Principal Component Analysis (PCA), where orthogonal components are derived from the original set of variables, with each component having uncorrelated predictors in it.
  4. Collinearity diagnostics can also be used to identify pairs of highly correlated variables and then take appropriate action to reduce multicollinearity in those cases. This includes examining Spearman’s correlation coefficients and Variance Inflation Factors (VIF).
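Method 2 above can be sketched with ridge regression in closed form, \( \hat{\beta} = (X'X + \lambda I)^{-1} X'y \) on centered data (a minimal illustration with made-up data; names are not from the original text):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
x1 = rng.standard_normal(n)
x2 = x1 + 0.05 * rng.standard_normal(n)   # nearly collinear predictor
y = 1.0 + x1 + x2 + rng.standard_normal(n)

X = np.column_stack([x1, x2])
Xc = X - X.mean(axis=0)                   # center so the intercept drops out
yc = y - y.mean()

def ridge(lam):
    """Closed-form ridge estimate; lam = 0 recovers OLS."""
    k = Xc.shape[1]
    return np.linalg.solve(Xc.T @ Xc + lam * np.eye(k), Xc.T @ yc)

beta_ols = ridge(0.0)     # unstable under collinearity
beta_ridge = ridge(10.0)  # penalty shrinks and stabilizes the coefficients
print(beta_ols, beta_ridge)
```

The penalty \(\lambda\) is a tuning parameter, typically chosen by cross-validation; the shrinkage mostly suppresses the poorly identified direction along which the collinear predictors offset each other, while their combined effect is preserved.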


The regression problem that will most likely increase the chances of making Type II errors is:

  A. multicollinearity.
  B. conditional heteroskedasticity.
  C. positive serial correlation.


The correct answer is A.

Multicollinearity artificially inflates the standard errors of the slope coefficients. This increases the likelihood of incorrectly concluding that a variable is not statistically significant (a Type II error).

B is incorrect. Conditional heteroskedasticity leads to underestimated standard errors, while the coefficient estimates remain unaffected. This inflates the t-statistics, leading to frequent rejection of the null hypothesis of no statistical significance (a Type I error).

C is incorrect. Positive serial correlation causes the ordinary least squares standard errors of the regression coefficients to underestimate the true standard errors. This inflates the estimated t-statistics, making coefficients appear more significant than they really are, which increases the chance of a Type I error.
