# The Least Squares Estimator
# Finite Sample Properties
# Unbiased Estimation
The least squares estimator can be written as
$$b = (X'X)^{-1}X'y = \beta + (X'X)^{-1}X'\varepsilon,$$
so, conditioning on $X$,
$$E[b \mid X] = \beta + (X'X)^{-1}X'E[\varepsilon \mid X] = \beta.$$
The interpretation of this result is that for any particular set of observations $b$ may differ from $\beta$, but averaged over all possible samples drawn with this $X$, the estimator equals the true parameter.
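A minimal Monte Carlo sketch of this result (the particular $\beta$, sample size, and number of replications below are illustrative assumptions, not from the text): holding $X$ fixed and redrawing $\varepsilon$ many times, the average of the OLS estimates should be close to $\beta$.

```python
import numpy as np

# Hypothetical simulation: check E[b | X] = beta by averaging OLS estimates
# over many draws of the disturbance vector, holding X fixed.
rng = np.random.default_rng(0)
n, K = 100, 3
beta = np.array([1.0, -2.0, 0.5])           # assumed true coefficients
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])

draws = []
for _ in range(2000):
    eps = rng.normal(size=n)                 # E[eps | X] = 0
    y = X @ beta + eps
    b = np.linalg.solve(X.T @ X, X.T @ y)    # b = (X'X)^{-1} X'y
    draws.append(b)

b_mean = np.mean(draws, axis=0)              # should be close to beta
```

Any single `b` in the loop misses $\beta$; only the average over replications recovers it, which is exactly what unbiasedness claims.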
# Omission of Relevant Variables
Suppose the true model is $y = X_1\beta_1 + X_2\beta_2 + \varepsilon$, but $X_2$ is omitted and $y$ is regressed on $X_1$ alone. The estimation is biased when a relevant variable is omitted in the regression:
$$E[b_1 \mid X] = \beta_1 + P_{1.2}\,\beta_2,$$
where
$$P_{1.2} = (X_1'X_1)^{-1}X_1'X_2$$
is the matrix of coefficients from regressing the columns of $X_2$ on $X_1$. The bias vanishes only if $X_1'X_2 = 0$ or $\beta_2 = 0$.
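A hedged numerical sketch of the bias formula (the correlation structure and coefficient values are assumptions for illustration): simulate the short regression repeatedly and compare the average estimate to $\beta_1 + P_{1.2}\beta_2$.

```python
import numpy as np

# Hypothetical check of the omitted-variable bias formula
# E[b1 | X] = beta1 + P_{1.2} beta2 with P_{1.2} = (X1'X1)^{-1} X1'X2.
rng = np.random.default_rng(1)
n = 500
x2 = rng.normal(size=n)
X1 = np.column_stack([np.ones(n), 0.8 * x2 + rng.normal(size=n)])  # correlated with x2
X2 = x2.reshape(-1, 1)
beta1, beta2 = np.array([1.0, 2.0]), np.array([3.0])

P12 = np.linalg.solve(X1.T @ X1, X1.T @ X2)   # regress columns of X2 on X1
expected_b1 = beta1 + (P12 @ beta2)

b1_draws = []
for _ in range(2000):
    y = X1 @ beta1 + X2 @ beta2 + rng.normal(size=n)
    b1_draws.append(np.linalg.solve(X1.T @ X1, X1.T @ y))
b1_mean = np.mean(b1_draws, axis=0)           # close to beta1 + P12 @ beta2, not beta1
```

Note the average lands on the biased target $\beta_1 + P_{1.2}\beta_2$, not on $\beta_1$, and the bias does not shrink with more replications.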
# Inclusion of Irrelevant Variables
Inclusion of irrelevant variables does not affect the unbiasedness of the estimates of the relevant coefficients. However, it comes at the cost of a larger covariance matrix, which decreases the efficiency of the estimation.
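The efficiency loss can be seen directly from $\operatorname{Var}[b \mid X] = \sigma^2(X'X)^{-1}$. A small sketch (the degree of correlation below is an assumption): adding an irrelevant regressor that is correlated with $x_1$ inflates the diagonal element of $(X'X)^{-1}$ corresponding to $x_1$.

```python
import numpy as np

# Hypothetical illustration: an irrelevant regressor correlated with x1
# inflates Var[b1 | X] = sigma^2 [(X'X)^{-1}]_{11}.
rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x_irrel = 0.9 * x1 + rng.normal(scale=0.5, size=n)  # irrelevant, correlated with x1

X_short = np.column_stack([np.ones(n), x1])
X_long = np.column_stack([np.ones(n), x1, x_irrel])

sigma2 = 1.0
var_short = sigma2 * np.linalg.inv(X_short.T @ X_short)[1, 1]
var_long = sigma2 * np.linalg.inv(X_long.T @ X_long)[1, 1]
# var_long > var_short: the efficiency cost of the irrelevant variable
```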
# Variance of the Least Squares Estimator
Since $b = \beta + (X'X)^{-1}X'\varepsilon$ and $E[\varepsilon\varepsilon' \mid X] = \sigma^2 I$,
$$\operatorname{Var}[b \mid X] = (X'X)^{-1}X'\,E[\varepsilon\varepsilon' \mid X]\,X(X'X)^{-1} = \sigma^2 (X'X)^{-1}.$$

Theorem (Gauss-Markov Theorem). In the linear regression model with regressor matrix $X$, the least squares estimator $b$ is the minimum variance linear unbiased estimator of $\beta$: for any other linear unbiased estimator $b_0 = Cy$, the matrix $\operatorname{Var}[b_0 \mid X] - \operatorname{Var}[b \mid X]$ is positive semidefinite.
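The theorem can be sketched numerically. Writing $C = (X'X)^{-1}X' + D$, unbiasedness of $b_0 = Cy$ forces $DX = 0$, and then $\operatorname{Var}[b_0 \mid X] - \operatorname{Var}[b \mid X] = \sigma^2 DD'$, which is positive semidefinite. A minimal check, with a randomly generated $D$ constructed to satisfy $DX = 0$ (all dimensions below are assumptions):

```python
import numpy as np

# Hypothetical check of the Gauss-Markov theorem: for b0 = Cy with
# C = (X'X)^{-1}X' + D and DX = 0, Var[b0|X] - Var[b|X] = sigma^2 D D'.
rng = np.random.default_rng(3)
n, K = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])

# Build D with rows orthogonal to the columns of X (so DX = 0 and Cy stays unbiased).
M = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)   # residual maker, MX = 0
D = rng.normal(size=(K, n)) @ M

sigma2 = 1.0
diff = sigma2 * (D @ D.T)                 # Var[b0|X] - Var[b|X]
eigvals = np.linalg.eigvalsh(diff)        # all eigenvalues >= 0, i.e. PSD
```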
# Estimating the Variance of the Least Squares Estimator
We don’t use
$$\hat\sigma^2 = \frac{e'e}{n},$$
which is biased downward, but use
$$s^2 = \frac{e'e}{n-K},$$
which is unbiased because $E[e'e \mid X] = (n-K)\sigma^2$. If we assume the disturbances are normally distributed, then the estimator $s^2$ satisfies $(n-K)s^2/\sigma^2 \sim \chi^2[n-K]$, and $b \mid X \sim N\!\left(\beta, \sigma^2(X'X)^{-1}\right)$.
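A small simulation sketch of the degrees-of-freedom correction (sample size, $K$, and $\sigma^2$ are assumed for illustration): dividing by $n$ understates $\sigma^2$ by the factor $(n-K)/n$, while dividing by $n-K$ is unbiased.

```python
import numpy as np

# Hypothetical simulation: e'e/n is biased downward for sigma^2, while
# s^2 = e'e/(n-K) is unbiased, since E[e'e | X] = (n-K) sigma^2.
rng = np.random.default_rng(4)
n, K, sigma2 = 30, 5, 2.0
X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])
beta = np.ones(K)

s2_draws, naive_draws = [], []
for _ in range(5000):
    y = X @ beta + rng.normal(scale=np.sqrt(sigma2), size=n)
    e = y - X @ np.linalg.solve(X.T @ X, X.T @ y)   # least squares residuals
    s2_draws.append(e @ e / (n - K))
    naive_draws.append(e @ e / n)

s2_mean = np.mean(s2_draws)        # close to sigma2
naive_mean = np.mean(naive_draws)  # close to sigma2 * (n-K)/n < sigma2
```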
# Large Sample Properties
# Consistency of the Estimator
Assume
$$\operatorname{plim}_{n\to\infty} \frac{1}{n}X'X = Q,$$
a positive definite matrix. Then
$$\operatorname{plim}\, b = \beta + Q^{-1}\operatorname{plim}\frac{1}{n}X'\varepsilon = \beta,$$
where $\operatorname{plim}\frac{1}{n}X'\varepsilon = 0$ because this term has mean zero and its conditional variance, $\frac{\sigma^2}{n}\left(\frac{X'X}{n}\right)$, converges to zero.
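A hedged illustration of consistency (the sample sizes and replication count are assumptions): the sampling error of $b$ shrinks as $n$ grows, since $b = \beta + (X'X/n)^{-1}(X'\varepsilon/n)$ and the second factor vanishes in probability.

```python
import numpy as np

# Hypothetical illustration of consistency: the average absolute error
# of the OLS estimate shrinks as the sample size grows.
rng = np.random.default_rng(5)
beta = np.array([1.0, -0.5])

def ols_error(n):
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    y = X @ beta + rng.normal(size=n)
    b = np.linalg.solve(X.T @ X, X.T @ y)
    return np.abs(b - beta).max()

# average error at increasing sample sizes; should be roughly O(1/sqrt(n))
errs = [np.mean([ols_error(n) for _ in range(200)]) for n in (50, 500, 5000)]
```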
# Consistency and Unbiasedness
Consider an i.i.d. sample $x_1, \dots, x_n$ with mean $\mu$ and variance $\sigma^2$.

- Unbiased but not consistent. The estimator $\hat\mu = x_1$ is an unbiased estimator of $\mu$ since $E[x_1] = \mu$. But $\hat\mu$ is not consistent since its distribution does not become more concentrated around $\mu$ as the sample size increases; it is always the distribution of a single observation.
- Consistent but not unbiased. Consider $\hat\sigma^2 = \frac{1}{n}\sum_{i=1}^n (x_i - \bar x)^2$. Since $E[\hat\sigma^2] = \frac{n-1}{n}\sigma^2 \ne \sigma^2$, $\hat\sigma^2$ is a biased estimator. When $n \to \infty$, $\frac{n-1}{n}\sigma^2 \to \sigma^2$, so $\hat\sigma^2$ is a consistent estimator.
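A quick numerical sketch of the second example (the particular $\mu$, $\sigma^2$, and sample sizes are assumptions): at $n = 5$ the $1/n$ variance estimator averages to $\frac{n-1}{n}\sigma^2$, while at large $n$ it converges to $\sigma^2$.

```python
import numpy as np

# Hypothetical check: the 1/n variance estimator is biased,
# E = (n-1)/n * sigma^2, yet consistent as n grows.
rng = np.random.default_rng(6)
mu, sigma2 = 3.0, 4.0

def var_hat(n):
    x = rng.normal(mu, np.sqrt(sigma2), size=n)
    return np.mean((x - x.mean()) ** 2)          # divides by n, not n-1

small = np.mean([var_hat(5) for _ in range(20000)])    # approx (4/5) * sigma2 = 3.2
large = np.mean([var_hat(5000) for _ in range(200)])   # approx sigma2 = 4.0
```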
# Asymptotic Normality of the Estimator
Theorem (Asymptotic Distribution of $b$). Under the assumptions above,
$$\sqrt{n}\,(b - \beta) \xrightarrow{d} N\!\left(0,\, \sigma^2 Q^{-1}\right),$$
so in large samples $b$ is approximately distributed as $N\!\left(\beta,\, \frac{\sigma^2}{n}Q^{-1}\right)$.
# Grenander Conditions
- For each column $x_k$ of $X$, if $d_{nk}^2 = x_k'x_k$, then $\lim_{n\to\infty} d_{nk}^2 = +\infty$: no regressor degenerates to a sequence of zeros.
- $\lim_{n\to\infty} x_{ik}^2 / d_{nk}^2 = 0$ for all $i$: no single observation dominates $x_k'x_k$.
- Let $R_n$ be the sample correlation matrix of the columns of $X$, excluding the constant term if there is one. Then $\lim_{n\to\infty} R_n = C$, a positive definite matrix: the regressors do not become perfectly collinear in the limit.
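A hedged simulation sketch of the asymptotic normality theorem (design, disturbance distribution, and replication counts are assumptions): with standardized regressors, $Q = I$, so the empirical covariance of $\sqrt{n}(b - \beta)$ should approach $\sigma^2 Q^{-1} = I$ even though the disturbances here are uniform, not normal.

```python
import numpy as np

# Hypothetical check: the empirical covariance of sqrt(n)(b - beta)
# approaches sigma^2 Q^{-1}. Here Q = I and sigma^2 = 1, and the
# disturbances are non-normal, so normality comes from the CLT.
rng = np.random.default_rng(9)
n = 2000
beta = np.array([1.0, -1.0])

draws = []
for _ in range(3000):
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    eps = rng.uniform(-np.sqrt(3), np.sqrt(3), size=n)  # mean 0, variance 1
    y = X @ beta + eps
    b = np.linalg.solve(X.T @ X, X.T @ y)
    draws.append(np.sqrt(n) * (b - beta))

emp_cov = np.cov(np.array(draws).T)   # approx sigma^2 Q^{-1} = I_2 here
```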
# Interval Estimation
The ratio
$$t_k = \frac{b_k - \beta_k}{\sqrt{s^2\,[(X'X)^{-1}]_{kk}}}$$
has a $t$ distribution with $(n-K)$ degrees of freedom, where $[(X'X)^{-1}]_{kk}$ is the $k$th diagonal element of $(X'X)^{-1}$. A $100(1-\alpha)\%$ confidence interval for $\beta_k$ is therefore
$$b_k \pm t_{(1-\alpha/2),[n-K]}\,\sqrt{s^2\,[(X'X)^{-1}]_{kk}}.$$
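A minimal sketch of constructing the interval on simulated data (the data-generating process is an assumption; `scipy.stats.t.ppf` supplies the critical value):

```python
import numpy as np
from scipy import stats

# Hypothetical 95% confidence intervals for each coefficient, using
# se(b_k) = sqrt(s^2 [(X'X)^{-1}]_{kk}) and the t[n-K] critical value.
rng = np.random.default_rng(7)
n, K = 40, 2
beta = np.array([1.0, 2.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ beta + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
e = y - X @ b
s2 = e @ e / (n - K)
se = np.sqrt(s2 * np.diag(XtX_inv))

t_crit = stats.t.ppf(0.975, df=n - K)      # t_{(1-alpha/2),[n-K]}
lower, upper = b - t_crit * se, b + t_crit * se
```

Note that with only $n - K = 38$ degrees of freedom, `t_crit` exceeds the normal critical value 1.96, widening the interval to account for estimating $\sigma^2$.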
# Prediction
For a new observation $y^0 = x^{0\prime}\beta + \varepsilon^0$, the least squares predictor is $\hat y^0 = x^{0\prime}b$, with prediction error $e^0 = y^0 - \hat y^0$. The prediction variance is
$$\operatorname{Var}[e^0 \mid X, x^0] = \sigma^2 + x^{0\prime}\left[\sigma^2(X'X)^{-1}\right]x^0,$$
and the prediction interval is
$$\hat y^0 \pm t_{(1-\alpha/2),[n-K]}\,\operatorname{se}(e^0),$$
where $\operatorname{se}(e^0)$ is computed by replacing $\sigma^2$ with $s^2$ in the variance above.
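A sketch of the interval at an assumed new point $x^0$ (the data-generating process and the value of $x^0$ are illustrative):

```python
import numpy as np
from scipy import stats

# Hypothetical 95% prediction interval at a new point x0:
# Var[e0] = sigma^2 (1 + x0'(X'X)^{-1} x0), estimated with s^2.
rng = np.random.default_rng(8)
n, K = 60, 2
beta = np.array([0.5, 1.5])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ beta + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
e = y - X @ b
s2 = e @ e / (n - K)

x0 = np.array([1.0, 0.3])                  # assumed new regressor values
y0_hat = x0 @ b
se_pred = np.sqrt(s2 * (1.0 + x0 @ XtX_inv @ x0))
t_crit = stats.t.ppf(0.975, df=n - K)
interval = (y0_hat - t_crit * se_pred, y0_hat + t_crit * se_pred)
```

The leading $\sigma^2$ term means the interval never collapses as $n$ grows: the irreducible noise $\varepsilon^0$ in the new observation always remains.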