Cheat Sheet Essay

Lecture 1: Data Transformation and Basic Statistics

- e.g. Conclusion: this z-score is outside the range ±1.96 at the 5% significance level, so we reject the null hypothesis that skewness is zero / that kurtosis is 3 / of normality.
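A minimal Python sketch of these z-tests (the sqrt(6/T) and sqrt(24/T) standard errors are the usual large-sample approximations; the data below is a placeholder):

    import numpy as np
    from scipy import stats

    returns = np.random.default_rng(0).normal(size=500)  # placeholder data
    T = len(returns)
    z_skew = stats.skew(returns) / np.sqrt(6 / T)                           # H0: skewness = 0
    z_kurt = (stats.kurtosis(returns, fisher=False) - 3) / np.sqrt(24 / T)  # H0: kurtosis = 3
    for name, z in (("skewness", z_skew), ("kurtosis", z_kurt)):
        print(name, "reject H0 at 5%" if abs(z) > 1.96 else "fail to reject H0")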
Tutorial Question:

Lecture 2: Measuring Dependence
- 3 ways to measure dependence:
1) Joint distribution functions

2) Covariance and Correlation
High correlation between two variables does not mean that one variable causes the other.
3) Regression analysis
- OLS provides a solution for α (the line intercept) and β (the slope of the line) which minimises the sum of squared distances from the data points to the fitted line.
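A small sketch of the closed-form single-regressor OLS estimates in Python (the data is illustrative):

    # beta = Cov(x, y) / Var(x); alpha = mean(y) - beta * mean(x)
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
    beta = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
    alpha = y.mean() - beta * x.mean()
    print(alpha, beta)  # fitted intercept and slope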
CAPM
- Model assumptions:

Hypothesis:
Compare to a t-distribution with T – 2 df
Confidence interval:
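A hedged sketch of the t-test and confidence interval for an estimated slope (beta_hat, its standard error and the null value are illustrative numbers assumed to come from an earlier regression):

    from scipy import stats

    beta_hat, se_beta, T = 1.15, 0.10, 60
    t_stat = (beta_hat - 1.0) / se_beta    # e.g. H0: beta = 1
    t_crit = stats.t.ppf(0.975, df=T - 2)  # two-sided 5% level, T - 2 df
    ci = (beta_hat - t_crit * se_beta, beta_hat + t_crit * se_beta)
    print(t_stat, t_crit, ci)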

Lecture 3: Multiple Linear Regression Model
Three Factor Fama-French model:

R(SMB) = returns of small market-cap stocks – returns of big market-cap stocks
R(HML) = returns of high book-to-price ratio stocks (value stocks) – returns of low book-to-price ratio stocks (growth stocks)
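The model regresses excess stock returns on the three factors. A sketch of estimating it with statsmodels, which also reports each coefficient's t-ratio (the file name and column names are assumptions about how the data is laid out):

    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("ff_factors.csv")  # hypothetical factor file
    X = sm.add_constant(df[["MKT_RF", "SMB", "HML"]])
    res_ff = sm.OLS(df["R_EXCESS"], X).fit()
    print(res_ff.summary())  # coefficients with their t-ratios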

Test statistic (t-ratio):

- F test: used to test more than one coefficient simultaneously. It involves estimating 2 regressions: The unrestricted regression and the restricted regression
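In the usual notation (m restrictions, k regressors in the unrestricted model, T observations) the statistic is F = [(RSS_restricted - RSS_unrestricted) / m] / [RSS_unrestricted / (T - k)], compared with an F(m, T - k) distribution.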

Matrices - A matrix is a collection of numbers. In order to multiply two matrices, the number of columns of the first matrix must equal the number of rows of the second matrix. When matrices are multiplied together, the resulting matrix is of size (number of rows of the first matrix x number of columns of the second matrix). When multiplying, multiply along the rows of the first matrix and down the columns of the second matrix. The transpose of a matrix (X') is the matrix obtained by switching the rows and columns of the matrix. The trace of a matrix is the sum of the terms on its leading diagonal.
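A small numpy illustration of these rules:

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6]])  # 2 x 3
    B = np.array([[1, 0],
                  [0, 1],
                  [1, 1]])     # 3 x 2: columns of A = rows of B, so A @ B is defined
    C = A @ B                  # resulting size: 2 x 2
    print(C)
    print(A.T)                 # transpose: 3 x 2
    print(np.trace(C))         # sum of the leading diagonal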
Lecture 4: Regression Model Diagnostics
- Homoskedasticity: the errors have a constant (equal) variance. Heteroskedasticity: the errors do not have a constant variance. The consequences of using OLS in the presence of heteroskedasticity: OLS still gives unbiased coefficient estimates, but the standard errors are incorrect and therefore our tests are incorrect.
White Test of heteroskedasticity

How to deal with heteroskedasticity? We can use an estimation method which takes it into account, called Generalised Least Squares (GLS); transform the data into logs; or use heteroskedasticity-consistent standard errors.
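A sketch of the White test and heteroskedasticity-consistent errors in statsmodels (the data is simulated so that the error variance grows with |x|):

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_white

    rng = np.random.default_rng(1)
    x = rng.normal(size=200)
    y = 1 + 2 * x + rng.normal(size=200) * (1 + np.abs(x))  # heteroskedastic errors
    X = sm.add_constant(x)
    res = sm.OLS(y, X).fit()
    lm_stat, lm_pval, f_stat, f_pval = het_white(res.resid, X)
    print(lm_pval)                               # small p-value -> reject homoskedasticity
    print(res.get_robustcov_results("HC1").bse)  # heteroskedasticity-consistent SEs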
Autocorrelation (A/C) - we may find time-series patterns in the residuals. What are the consequences of ignoring A/C? The coefficient estimates are still unbiased, but the standard errors are incorrect and as such the tests are too.
- Use Durbin-Watson (DW) for first-order A/C: it tests H0: ρ = 0 against H1: ρ > 0 or ρ < 0.
- Values of the DW statistic close to 2 indicate that there is no A/C. If there is negative A/C, DW should be significantly greater than 2. If there is positive A/C, DW should be significantly less than 2.
- A/C: the Breusch-Godfrey test (Lagrange Multiplier); both tests are sketched below.
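A sketch of both tests on a regression with simulated AR(1) errors:

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.stattools import durbin_watson
    from statsmodels.stats.diagnostic import acorr_breusch_godfrey

    rng = np.random.default_rng(2)
    e = np.zeros(200)
    for t in range(1, 200):
        e[t] = 0.7 * e[t - 1] + rng.normal()  # positively autocorrelated errors
    x = rng.normal(size=200)
    res_ac = sm.OLS(1 + 2 * x + e, sm.add_constant(x)).fit()
    print(durbin_watson(res_ac.resid))  # well below 2 -> positive A/C
    lm_stat, lm_pval, f_stat, f_pval = acorr_breusch_godfrey(res_ac, nlags=1)
    print(lm_pval)                      # small p-value -> reject "no A/C"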

How to deal with A/C? If the form of the A/C is known we could use a GLS procedure; we could eliminate A/C by adding lagged dependent variables to the regression; or we could use robust standard errors (Newey-West SEs).
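For example, Newey-West (HAC) standard errors can be requested directly when fitting (continuing the res_ac regression from the sketch above):

    res_nw = res_ac.model.fit(cov_type="HAC", cov_kwds={"maxlags": 1})
    print(res_nw.bse)  # SEs robust to A/C (and heteroskedasticity)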
Assumption: Linearity - Ramsey’s RESET test
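A sketch of the RESET test in statsmodels (needs statsmodels >= 0.11; reuses res_ac from the autocorrelation sketch):

    from statsmodels.stats.diagnostic import linear_reset

    reset = linear_reset(res_ac, power=2, use_f=True)  # adds squared fitted values
    print(reset.pvalue)  # small p-value -> evidence the linear form is misspecified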
- Perfect multicollinearity: the explanatory variables are perfectly correlated
- Near multicollinearity: When the explanatory variables are highly correlated (but not perfectly)
What are the consequences? R-squared will be high but the individual coefficients will have large standard errors (small t-ratios), confidence intervals will be wide and regressors will appear insignificant.
- The easiest way to measure the extent of multicollinearity
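One common diagnostic (an assumption about what the lecture intends here) is the variance inflation factor, VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing regressor j on the other regressors; values well above ~10 are usually flagged:

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    rng = np.random.default_rng(3)
    x1 = rng.normal(size=200)
    x2 = x1 + 0.05 * rng.normal(size=200)  # nearly collinear with x1
    X = sm.add_constant(np.column_stack([x1, x2]))
    print([variance_inflation_factor(X, j) for j in range(1, X.shape[1])])  # large VIFs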
Lecture 5: Time Series Models
Stationarity: The basic idea of stationarity is that the probability laws governing the process do not change with time.
Weak (covariance) stationarity: a process is weakly stationary if its first- and second-order moments are unaffected by a change of time origin.
- 3 conditions need to be satisfied for a series to be (weakly) stationary:
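For a weakly stationary series y_t these are: 1) E[y_t] = μ, a constant mean; 2) Var(y_t) = σ² < ∞, a constant, finite variance; 3) Cov(y_t, y_{t-s}) = γ_s, i.e. the autocovariance depends only on the lag s, not on the time t.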