

Ch3: OLS in the General Case
Economics 20011, University of Bristol (Gregory Jolivet)
Flashcards by Harry W. (7 cards)
OLS with K≥1 Regressors

We now look for K+1 values b̂_{0}, b̂_{1}, ..., b̂_{K} that minimise the sum of squared residuals (SSR). These values are found by solving a system of K+1 equations, the first-order conditions of the minimisation problem:

0 = Σ_{i=1}^{n} (y_{i} - b̂_{0} - Σ_{k=1}^{K} b̂_{k}x_{ki}),

0 = Σ_{i=1}^{n} x_{li} (y_{i} - b̂_{0} - Σ_{k=1}^{K} b̂_{k}x_{ki}), for l = 1, ..., K.

The conditions can also be written as:

0 = ȳ - b̂_{0} - Σ_{k=1}^{K} b̂_{k}x̄_{k},

0 = 1/n Σ_{i=1}^{n} x_{li}y_{i} - b̂_{0}x̄_{l} - Σ_{k=1}^{K} b̂_{k} (1/n Σ_{i=1}^{n} x_{li}x_{ki}), for l = 1, ..., K.

This system is linear in the b parameters, so we can solve it using matrix algebra.
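As a sketch of what solving this system looks like in practice, the snippet below builds the K+1 normal equations from simulated data and solves them with NumPy. The sample size, seed, and true coefficient values are all hypothetical, chosen only for illustration.

```python
import numpy as np

# Simulated data; n, K and the coefficient values are hypothetical.
rng = np.random.default_rng(0)
n, K = 200, 2
x = rng.normal(size=(n, K))                              # K regressors
y = 1.0 + x @ np.array([2.0, -0.5]) + rng.normal(size=n)

# Stack a column of ones so b0 is estimated alongside the slopes,
# then solve the K+1 first-order conditions as one linear system.
X = np.column_stack([np.ones(n), x])
b_hat = np.linalg.solve(X.T @ X, X.T @ y)                # K+1 normal equations
```

With enough observations, `b_hat` lands close to the coefficients used to generate the data.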

Predicted Outcome

The predicted outcome:


ŷ_{i} = b̂_{0} + b̂_{1}x_{1i} + ... + b̂_{K}x_{Ki}

We can define the predicted outcome as a random variable:

Ŷ = b̂_{0} + b̂_{1}X_{1} + ... + b̂_{K}X_{K}

Rewriting the FOC differentiated w.r.t. b̂_{0} we find that:

0 = 1/n Σ_{i=1}^{n} (y_{i} - b̂_{0} - Σ_{k=1}^{K} b̂_{k}x_{ki}) = 1/n Σ_{i=1}^{n} (y_{i} - ŷ_{i}) = 1/n Σ_{i=1}^{n} û_{i} = ū̂

The OLS estimate is such that the predicted residuals have zero mean.

Predicted Residual

The predicted residual:


û_{i} = y_{i} - ŷ_{i}

We can define the predicted residual as a random variable:

Û = Y - Ŷ

Rewriting the FOC differentiated w.r.t. b̂_{0} we find that:

0 = 1/n Σ_{i=1}^{n} (y_{i} - b̂_{0} - Σ_{k=1}^{K} b̂_{k}x_{ki}) = 1/n Σ_{i=1}^{n} (y_{i} - ŷ_{i}) = 1/n Σ_{i=1}^{n} û_{i} = ū̂

The OLS estimate is such that the predicted residuals have zero mean.

OLS Estimator: Predicted outcome and residual

Rewriting the FOCs:

0 = 1/n Σ_{i=1}^{n} (y_{i} - b̂_{0} - Σ_{k=1}^{K} b̂_{k}x_{ki}) = 1/n Σ_{i=1}^{n} (y_{i} - ŷ_{i}) = 1/n Σ_{i=1}^{n} û_{i} = ū̂

0 = 1/n Σ_{i=1}^{n} x_{li}(y_{i} - b̂_{0} - Σ_{k=1}^{K} b̂_{k}x_{ki}) = 1/n Σ_{i=1}^{n} x_{li}(y_{i} - ŷ_{i}) = 1/n Σ_{i=1}^{n} x_{li}û_{i}

Since ū̂ = 0, the last term equals the sample covariance between x_{l} and û.

The OLS estimate is such that the predicted residuals have zero mean and are not correlated with any of the regressors.
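These two properties are easy to verify numerically. The snippet below fits OLS on simulated data (the data-generating values are hypothetical) and checks that the residuals average to zero and have zero sample covariance with every regressor.

```python
import numpy as np

# Simulated data; the data-generating process is hypothetical.
rng = np.random.default_rng(1)
n, K = 100, 3
x = rng.normal(size=(n, K))
y = 0.5 + x @ rng.normal(size=K) + rng.normal(size=n)

# OLS fit via the normal equations, then form the predicted residuals.
X = np.column_stack([np.ones(n), x])
b_hat = np.linalg.solve(X.T @ X, X.T @ y)
u_hat = y - X @ b_hat

mean_u = u_hat.mean()          # ~ 0: residuals have zero mean
cross = x.T @ u_hat / n        # ~ 0: zero covariance with each regressor
```

Both quantities are zero up to floating-point error, regardless of the sample drawn, because they are exactly the first-order conditions.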

OLS as an Orthogonal Projection

We have that predicted outcomes are orthogonal to predicted residuals:

Σ_{i=1}^{n} ŷ_{i}û_{i} = Σ_{i=1}^{n} (b̂_{0} + Σ_{k=1}^{K} b̂_{k}x_{ki})û_{i} = [ b̂_{0}⋅Σ_{i=1}^{n} û_{i} ] + [ Σ_{k=1}^{K} b̂_{k}⋅Σ_{i=1}^{n} x_{ki}û_{i} ] = 0

Equivalently, the empirical covariance of Û and Ŷ is zero:

côv(Û,Ŷ) = côv(Û, b̂_{0} + Σ_{k=1}^{K}b̂_{k}X_{k}) = côv(Û, b̂_{0}) + Σ_{k=1}^{K}b̂_{k}côv(Û, X_{k}) = 0.
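The orthogonality claim can be checked directly: fit OLS on simulated data (hypothetical values below) and compute both the inner product Σ ŷ_{i}û_{i} and the empirical covariance of fitted values and residuals.

```python
import numpy as np

# Simulated data; coefficients and sample size are hypothetical.
rng = np.random.default_rng(2)
n = 150
x = rng.normal(size=(n, 2))
y = x @ np.array([1.0, 1.0]) + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
b_hat = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ b_hat                      # predicted outcomes
u_hat = y - y_hat                      # predicted residuals

inner = y_hat @ u_hat                  # Σ ŷ_i û_i, ~ 0
emp_cov = np.cov(y_hat, u_hat)[0, 1]   # empirical covariance, ~ 0
```

Both are zero up to floating-point error: the fitted values live in the column space of **X**, and the residuals are orthogonal to that space.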

OLS: Matrix Notation

We define the following vectors and matrices:

**Y** = [ y_{1}, ..., y_{n} ]', an n×1 vector; β = [ b_{0}, b_{1}, ..., b_{K} ]', a (K+1)×1 vector; and

**X** = [ 1 x_{11} ... x_{K1} ]
        [ ... ... ... ...    ]
        [ 1 x_{1n} ... x_{Kn} ]

an n×(K+1) matrix. **X** contains all realisations of the K+1 regressors. Each row i∈[1,n] of **X** gives the values of all regressors for observation i. Each column k∈[2,K+1] of **X** gives all the realisations of the variable X_{k-1} across all observations.

We have that SSR(β) = (**Y** - **X**β)'(**Y** - **X**β). The OLS estimate of β is the value β̂ minimising SSR(β). The first-order condition reads:

**X**'(**Y** - **X**β̂) = 0 ⇔ **X**'**Y** = **X**'**X**β̂

Then if the matrix **X**'**X** is invertible (meaning **X** has full rank), we have the closed-form expression of the OLS estimator:

β̂ = (**X**'**X**)^{-1}**X**'**Y**.
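The closed-form expression translates directly into matrix code. The sketch below (simulated, hypothetical data) computes β̂ both with an explicit inverse, mirroring the formula, and by solving the system **X**'**X**β̂ = **X**'**Y** directly.

```python
import numpy as np

# Simulated data; n, K and the true β are hypothetical.
rng = np.random.default_rng(3)
n, K = 120, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, K))])  # n x (K+1)
beta = np.array([0.3, 1.5, -2.0])
Y = X @ beta + rng.normal(size=n)

# Closed form: valid only when X'X is invertible (X has full column rank).
beta_hat = np.linalg.inv(X.T @ X) @ (X.T @ Y)

# Numerically preferable: solve the normal equations without forming the inverse.
beta_hat_solve = np.linalg.solve(X.T @ X, X.T @ Y)
```

The two computations agree to machine precision; in practice the `solve` form is preferred because explicitly inverting **X**'**X** is slower and less numerically stable.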

Multicollinearity

If one regressor is an exact linear combination of the others, **X** does not have full rank, so **X**'**X** is not invertible and the closed-form OLS estimator is not defined.
