ANOVA Table and Measures of Goodness of Fit

R-squared \(\bf{(R^2)}\) measures how well an estimated regression fits the data. It is also known as the coefficient of determination and can be formulated as: $$ R^2=\frac{\text{Sum of squares regression}}{\text{Sum of squares total}}=\frac{\sum_{i=1}^{n}(\widehat{Y}_i-\bar{Y})^2}{\sum_{i=1}^{n}(Y_i-\bar{Y})^2} $$ Where: \(n\) = Number of observations….
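A minimal numerical sketch of this ratio, assuming hypothetical observed and fitted values (the data and the use of NumPy are illustrative assumptions, not part of the reading):

```python
import numpy as np

# Hypothetical observed values and fitted values from an estimated regression
y = np.array([4.1, 5.0, 6.2, 7.1, 8.3])        # observed Y_i
y_hat = np.array([4.3, 5.1, 6.0, 7.2, 8.1])    # fitted Y_i

y_bar = y.mean()
ss_regression = np.sum((y_hat - y_bar) ** 2)   # sum of squares regression
ss_total = np.sum((y - y_bar) ** 2)            # sum of squares total

r_squared = ss_regression / ss_total
print(f"R^2 = {r_squared:.4f}")
```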

More Details
What Multiple Regression Is and How It Works

Example: Multiple Regression in the Investment World

James Chase, an investment analyst, wants to determine the impact of inflation rates and real rates of interest on the price of the US Dollar index (USDX). Chase uses the multiple regression model below:…
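A sketch of how such a two-factor model could be estimated. The data values and the exact specification (USDX regressed on inflation and the real interest rate) are illustrative assumptions, since the model itself is elided above:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical monthly observations (illustrative values only)
inflation = np.array([2.1, 2.3, 2.0, 2.6, 3.0, 3.2, 2.8, 2.5])    # inflation rate, %
real_rate = np.array([0.5, 0.4, 0.6, 0.3, 0.1, 0.0, 0.2, 0.4])    # real interest rate, %
usdx = np.array([96.2, 95.8, 96.5, 95.1, 94.0, 93.6, 94.4, 95.0]) # USDX level

# Assumed two-factor specification: USDX = b0 + b1*inflation + b2*real_rate + error
X = sm.add_constant(np.column_stack([inflation, real_rate]))
fit = sm.OLS(usdx, X).fit()

print(fit.params)     # estimates of b0, b1, b2
print(fit.rsquared)   # goodness of fit
```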

More Details
A Review of Multiple Linear Regression’s Uses

Multiple linear regression explains the variation of the dependent variable using two or more independent variables. When used properly, it can improve predictions; when used incorrectly, it can create spurious relationships that undermine those predictions. Typically, a multiple…

More Details
Choosing the Appropriate Time-Series Model

The following guidelines are used to determine the most appropriate model for the need at hand: understand the investment problem, then choose an initial model. Plot the time series to check for covariance stationarity. Observe if there is…

More Details
Cointegration

Consider a time series of the inflation rate \((\text{y}_{\text{t}})\) regressed on a time series of interest rates \((\text{x}_{\text{t}})\): $$\text{y}_{\text{t}}=\text{b}_{0}+\text{b}_{1}\text{x}_{\text{t}}+\epsilon_{\text{t}}$$ In this case, we have two different time series, \(\text{y}_{\text{t}}\) and \(\text{x}_{\text{t}}\). Either one of the time series is subject to…
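A sketch of one way to test two such series for cointegration, using the Engle-Granger style test available in statsmodels; the simulated data and the choice of test are assumptions, not part of the reading:

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(42)

# Hypothetical series: x_t is a random walk, and y_t tracks x_t plus stationary noise,
# so the two nonstationary series are cointegrated by construction
x = np.cumsum(rng.normal(size=200))
y = 1.5 + 0.8 * x + rng.normal(scale=0.5, size=200)

t_stat, p_value, crit_values = coint(y, x)
print(f"t-stat = {t_stat:.2f}, p-value = {p_value:.4f}")
# A small p-value suggests the residuals from regressing y_t on x_t are stationary,
# which is what cointegration requires
```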

More Details
Autoregressive Conditional Heteroskedasticity

Heteroskedasticity is the dependence of the variance of the error term on the independent variable. So far, we have been assuming that the time series follows the homoskedasticity assumption. Homoskedasticity is the independence of the variance of the error term from the independent…
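A sketch of an ARCH(1) check, which regresses squared residuals on their own first lag; the simulated residual series and the helper steps are assumptions for illustration:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Hypothetical regression residuals (in practice these come from the fitted model)
resid = rng.normal(size=300)

# ARCH(1) check: regress squared residuals on their own first lag
eps_sq = resid ** 2
X = sm.add_constant(eps_sq[:-1])   # lagged squared residuals
y = eps_sq[1:]                     # current squared residuals

arch_fit = sm.OLS(y, X).fit()
print(f"slope t-stat = {arch_fit.tvalues[1]:.2f}, p-value = {arch_fit.pvalues[1]:.4f}")
# A significant slope means the error variance depends on the previous squared error,
# i.e., the errors exhibit autoregressive conditional heteroskedasticity
```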

More Details
Seasonality

Seasonality is a time series feature in which data shows regular and predictable patterns that recur every year. For example, retail sales tend to peak for the Christmas season and then decline after the holidays. A seasonal lag is the…
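A sketch of detecting such a pattern by inspecting the residual autocorrelation of an AR(1) fit at the seasonal lag (lag 4 for quarterly data); the simulated sales series and the two-standard-error band are assumptions:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Hypothetical quarterly sales with a recurring fourth-quarter bump
n_quarters = 40
seasonal_bump = np.tile([0.0, 0.0, 0.0, 0.3], n_quarters // 4)
sales = 1.0 + seasonal_bump + rng.normal(scale=0.05, size=n_quarters)

# Fit an AR(1) model, then inspect residual autocorrelation at the seasonal lag (4)
ar1 = sm.OLS(sales[1:], sm.add_constant(sales[:-1])).fit()
resid = ar1.resid
rho_4 = np.corrcoef(resid[4:], resid[:-4])[0, 1]
band = 2 / np.sqrt(len(resid))   # rough two-standard-error band for autocorrelations

print(f"lag-4 residual autocorrelation = {rho_4:.3f}, band = +/-{band:.3f}")
# An autocorrelation outside the band points to seasonality; a common remedy is to add
# a seasonal lag (the value four quarters earlier) as another regressor
```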

More Details
The Unit Root Test for Nonstationarity

Unit root testing involves checking whether the time series is covariance stationary. We can either form an AR model and check for autocorrelations or perform a Dickey-Fuller test. A t-test is performed to examine the statistical significance of…
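A sketch of the second approach, applying the augmented Dickey-Fuller test from statsmodels to a simulated random walk; the data are an assumption made for illustration:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(7)

# Hypothetical series: a random walk, which has a unit root by construction
series = np.cumsum(rng.normal(size=250))

adf_stat, p_value, used_lag, n_obs, crit_values, _ = adfuller(series)
print(f"ADF statistic = {adf_stat:.2f}, p-value = {p_value:.3f}")
print(crit_values)
# Failing to reject the null of a unit root (a large p-value) is consistent with the
# series not being covariance stationary
```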

More Details
Unit Roots for Time-Series Analysis

The Unit Root Problem

An AR(1) series is said to be covariance stationary if the absolute value of the lag coefficient \(\text{b}_{1}\) is less than 1. If the absolute value of \(\text{b}_{1}\) equals 1, the time series is said to have a…
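A small simulation contrasting a stationary AR(1) series with one whose lag coefficient equals 1; the coefficients, sample size, and seed are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_ar1(b1, n=500, b0=0.0):
    """Simulate x_t = b0 + b1 * x_{t-1} + e_t for a given lag coefficient b1."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = b0 + b1 * x[t - 1] + rng.normal()
    return x

stationary = simulate_ar1(b1=0.5)   # |b1| < 1: mean-reverting, stable variance
unit_root = simulate_ar1(b1=1.0)    # |b1| = 1: a random walk, variance grows over time

print(f"b1 = 0.5: std of halves = {stationary[:250].std():.2f}, {stationary[250:].std():.2f}")
print(f"b1 = 1.0: std of halves = {unit_root[:250].std():.2f}, {unit_root[250:].std():.2f}")
```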

More Details
Random Walk Process

A time series is said to follow a random walk process if the predicted value of the series in one period is equivalent to the value of the series in the previous period plus a random error. A simple random…
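A sketch of this idea with simulated data: the one-step-ahead forecast is just the current value, and first-differencing the series recovers the stationary shocks (the series itself is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical random walk: x_t = x_{t-1} + e_t
shocks = rng.normal(size=100)
x = np.cumsum(shocks)

# Under a random walk, the best forecast of the next value is the current value
print(f"last observed value = {x[-1]:.3f}, one-step-ahead forecast = {x[-1]:.3f}")

# First-differencing recovers the white-noise shocks, which are covariance stationary
diff = np.diff(x)
print(f"mean of first differences = {diff.mean():.3f}, std = {diff.std():.3f}")
```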

More Details