1
EC3090 Econometrics Junior Sophister 2009-2010
Topic 3 The Multiple Regression Model
Reading Wooldridge, Chapter 3 Gujarati and
Porter, Chapter 7
2
Topic 3 The Multiple Regression Model
  • 1. The model with two independent variables
  • Say we have information on more variables that
    theory tells us may influence Y:
    Y = β0 + β1X1 + β2X2 + u
  • β0 measures the average value of Y when X1 and
    X2 are zero
  • β1 and β2 are the partial regression
    coefficients/slope coefficients, which measure the
    ceteris paribus effect of X1 and X2 on Y,
    respectively
  • Key assumption: E(u | X1, X2) = 0
  • For k independent variables:
    Y = β0 + β1X1 + β2X2 + ... + βkXk + u

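To make the objects above concrete, the model with two independent variables can be simulated. A minimal sketch in numpy (all names and parameter values are illustrative, not from the slides):

```python
import numpy as np

# Illustrative simulation of the two-variable population model
# Y = b0 + b1*X1 + b2*X2 + u, with u mean-independent of X1 and X2.
rng = np.random.default_rng(0)
n = 1000
b0, b1, b2 = 1.0, 2.0, -0.5           # population parameters (chosen arbitrarily)
X1 = rng.normal(size=n)
X2 = 0.4 * X1 + rng.normal(size=n)    # the regressors may be correlated with each other
u = rng.normal(size=n)                # error term, drawn independently of X1 and X2
Y = b0 + b1 * X1 + b2 * X2 + u
```

Note that X2 is deliberately built to be correlated with X1; the later slides on partialling out and omitted variable bias turn on exactly this feature.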
3
Topic 3 The Multiple Regression Model
  • 2. OLS estimation of the multiple regression
    model
  • Simultaneously choose the values of the unknown
    parameters of the population model that minimise
    the sum of the squared residuals
  • The first-order conditions are given by the k + 1
    equations:
    Σi (Yi − β̂0 − β̂1X1i − ... − β̂kXki) = 0
    Σi Xji (Yi − β̂0 − β̂1X1i − ... − β̂kXki) = 0, for j = 1, ..., k
  • Note: these equations can also be obtained using
    method of moments (MM) estimation

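The k + 1 first-order conditions are the "normal equations" X′X β̂ = X′Y, which can be solved in one step. A sketch under illustrative names and simulated data:

```python
import numpy as np

# The k+1 OLS first-order conditions collected in matrix form are
# X'X beta_hat = X'Y; numpy solves them simultaneously (illustrative data).
rng = np.random.default_rng(1)
n, k = 500, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # intercept + k regressors
beta = np.array([1.0, 2.0, -0.5])
Y = X @ beta + rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)   # solves all k+1 equations at once
resid = Y - X @ beta_hat

# Each first-order condition says the residuals are orthogonal to one column of X:
print(np.allclose(X.T @ resid, 0))             # True (up to floating-point error)
```

The orthogonality check is exactly the set of k + 1 equations on the slide: one condition for the intercept column of ones and one for each regressor.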
4
Topic 3 The Multiple Regression Model
  • 2. OLS estimation of the multiple regression
    model
  • Consider the case where k = 2. OLS requires that
    we minimise
    Σi (Yi − β̂0 − β̂1X1i − β̂2X2i)²
  • The first-order conditions are given by the 3
    equations:
    Σi (Yi − β̂0 − β̂1X1i − β̂2X2i) = 0
    Σi X1i (Yi − β̂0 − β̂1X1i − β̂2X2i) = 0
    Σi X2i (Yi − β̂0 − β̂1X1i − β̂2X2i) = 0
  • Solve simultaneously to find the OLS parameter
    estimators
  • (Illustrate on board)

5
Topic 3 The Multiple Regression Model
  • 2. OLS estimation of the multiple regression
    model
  • Algebraic Properties
  • 1. The residuals sum to zero: Σi ûi = 0
  • 2. The sample covariance between each regressor
    and the residuals is zero: Σi Xkiûi = 0
  • 3. The sample covariance between the fitted
    values and the residuals is zero: Σi Ŷiûi = 0
  • 4. The point of sample means (X̄1, ..., X̄k, Ȳ)
    is always on the regression line

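These algebraic properties hold mechanically in any OLS fit, which can be verified on simulated data. A sketch (variable names and values illustrative):

```python
import numpy as np

# Verifying the algebraic properties of OLS on simulated data.
rng = np.random.default_rng(2)
n = 200
X1, X2 = rng.normal(size=n), rng.normal(size=n)
Y = 1.0 + 2.0 * X1 - 0.5 * X2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), X1, X2])
b = np.linalg.lstsq(X, Y, rcond=None)[0]
Y_hat = X @ b                     # predicted values
u_hat = Y - Y_hat                 # residuals

print(np.isclose(u_hat.sum(), 0))                             # 1. residuals sum to zero
print(np.isclose(X1 @ u_hat, 0), np.isclose(X2 @ u_hat, 0))   # 2. regressors orthogonal to residuals
print(np.isclose(Y_hat @ u_hat, 0))                           # 3. fitted values orthogonal to residuals
print(np.isclose(Y.mean(), b[0] + b[1] * X1.mean() + b[2] * X2.mean()))  # 4. point of means on the line
```

All four checks print True (up to floating-point error); they follow directly from the first-order conditions, not from any statistical assumption.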
6
Topic 3 The Multiple Regression Model
  • 3. Interpreting the coefficients of the Multiple
    Regression Model
  • OLS slope coefficients depend on the
    relationship between each of the individual
    variables and Y and on the relationship between
    the Xs (illustrate)
  • In the two-variable example, re-write the OLS
    estimator for β1 as
    β̂1 = Σi r̂i1Yi / Σi r̂i1²
  • where the r̂i1 are the OLS residuals from a simple
    regression of X1 on X2.
  • Thus, β̂1 gives the pure effect of X1 on Y,
    i.e., netting out the effect of X2.
  • Predicted values: Ŷi = β̂0 + β̂1X1i + β̂2X2i
  • Residuals: ûi = Yi − Ŷi
  • If ûi > 0 the model under-predicts Y
  • If ûi < 0 the model over-predicts Y

7
Topic 3 The Multiple Regression Model
  • 3. Interpreting the coefficients of the Multiple
    Regression Model
  • Relationship between simple and multiple
    regression estimates:
    β̃1 = β̂1 + β̂2δ̃1
  • where the coefficients β̂1 and β̂2 are OLS estimates from
    the multiple regression of Y on X1 and X2, and δ̃1 is the
    slope from a simple regression of X2 on X1
  • The inclusion of additional regressors will
    affect the slope estimates
  • But β̃1 = β̂1 where
  • 1. β̂2 = 0 (X2 has no partial effect on Y), or
  • 2. X1 and X2 are uncorrelated in the sample (δ̃1 = 0)

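The decomposition β̃1 = β̂1 + β̂2δ̃1 is an exact in-sample identity, which a short numeric check makes visible (simulated data, illustrative names):

```python
import numpy as np

# Check of beta_tilde_1 = beta_hat_1 + beta_hat_2 * delta_tilde_1,
# where delta_tilde_1 is the slope from regressing X2 on X1.
rng = np.random.default_rng(4)
n = 400
X1 = rng.normal(size=n)
X2 = 0.7 * X1 + rng.normal(size=n)
Y = 1.0 + 2.0 * X1 - 0.5 * X2 + rng.normal(size=n)

def ols(Xcols, y):
    """OLS coefficients from a regression of y on an intercept plus Xcols."""
    X = np.column_stack([np.ones(len(y))] + Xcols)
    return np.linalg.lstsq(X, y, rcond=None)[0]

b_simple = ols([X1], Y)        # short regression: Y on X1 only
b_full = ols([X1, X2], Y)      # long regression: Y on X1 and X2
delta1 = ols([X1], X2)[1]      # slope of X2 on X1

print(np.isclose(b_simple[1], b_full[1] + b_full[2] * delta1))   # True
```

Because the identity is algebraic, it holds in every sample; only when β̂2 = 0 or δ̃1 = 0 do the short and long regressions agree.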
8
Topic 3 The Multiple Regression Model
  • 4. Goodness-of-Fit in the Multiple Regression
    Model
  • How well does the regression line fit the
    observations?
  • As in the simple regression model, define
  • SST = Total Sum of Squares = Σi (Yi − Ȳ)²
  • SSE = Explained Sum of Squares = Σi (Ŷi − Ȳ)²
  • SSR = Residual Sum of Squares = Σi ûi²
  • Recall SST = SSE + SSR ⇒ SSE ≤ SST and SSE ≥ 0
  • ⇒ 0 ≤ SSE/SST ≤ 1, so define R² = SSE/SST
  • R² never decreases as more independent variables
    are added, so use the adjusted R²:
    adjusted R² = 1 − (1 − R²)(n − 1)/(n − k − 1)

Includes a punishment for adding more variables to
the model
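The goodness-of-fit quantities can be computed directly from the sums of squares. A sketch on simulated data (names and values illustrative):

```python
import numpy as np

# R^2 and adjusted R^2 from the sums of squares.
rng = np.random.default_rng(5)
n, k = 100, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
Y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)

b = np.linalg.lstsq(X, Y, rcond=None)[0]
u_hat = Y - X @ b
SST = ((Y - Y.mean()) ** 2).sum()   # total sum of squares
SSR = (u_hat ** 2).sum()            # residual sum of squares
SSE = SST - SSR                     # explained sum of squares

R2 = SSE / SST
R2_adj = 1 - (1 - R2) * (n - 1) / (n - k - 1)   # penalises extra regressors
print(0 <= R2 <= 1, R2_adj <= R2)               # True True
```

Since (n − 1)/(n − k − 1) > 1 whenever k ≥ 1, the adjusted R² is always below the plain R², and it can fall when an unhelpful variable is added.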
9
Topic 3 The Multiple Regression Model
  • 5. Properties of OLS Estimator of Multiple
    Regression Model
  • Gauss-Markov Theorem
  • Under certain assumptions, known as the
    Gauss-Markov assumptions, the OLS estimator will
    be the Best Linear Unbiased Estimator (BLUE)
  • Linear: the estimator is a linear function of the
    data
  • Unbiased: E(β̂j) = βj for each j
  • Best: the estimator is the most efficient estimator,
    i.e., it has the minimum variance of all
    linear unbiased estimators

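Unbiasedness is a statement about repeated sampling: the average of β̂1 across many samples equals β1. A Monte Carlo sketch (sample sizes, seeds, and parameter values illustrative):

```python
import numpy as np

# Monte Carlo illustration of unbiasedness: across repeated samples the
# average OLS estimate is close to the true parameter.
rng = np.random.default_rng(6)
n, reps, b1_true = 100, 2000, 2.0
estimates = np.empty(reps)
for r in range(reps):
    X1 = rng.normal(size=n)
    X2 = rng.normal(size=n)
    Y = 1.0 + b1_true * X1 - 0.5 * X2 + rng.normal(size=n)
    X = np.column_stack([np.ones(n), X1, X2])
    estimates[r] = np.linalg.lstsq(X, Y, rcond=None)[0][1]

avg_gap = abs(estimates.mean() - b1_true)   # shrinks as reps grows
```

Any single estimate deviates from 2.0, but the mean of the 2000 estimates sits within a few thousandths of it, which is what E(β̂1) = β1 predicts.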
10
Topic 3 The Multiple Regression Model
  • 5. Properties of OLS Estimator of Multiple
    Regression Model
  • Assumptions required to prove unbiasedness:
  • A1: Regression model is linear in parameters
  • A2: The Xs are non-stochastic, or fixed in repeated
    sampling
  • A3: Zero conditional mean: E(u | X1, ..., Xk) = 0
  • A4: Sample is random
  • A5: Variability in the Xs and no
    perfect collinearity among the Xs
  • Assumptions required to prove efficiency:
  • A6: Homoscedasticity and no autocorrelation:
    Var(ui | X) = σ² and Cov(ui, uj) = 0 for i ≠ j

11
Topic 3 The Multiple Regression Model
  • 6. Estimating the variance of the OLS estimators
  • Need to know the dispersion (variance) of the sampling
    distribution of the OLS estimator in order to show
    that it is efficient (also required for
    inference)
  • In the multiple regression model:
    Var(β̂k) = σ² / [SSTk (1 − R²k)]
  • This depends on:
  • a) σ², the error variance (reduces accuracy of
    estimates)
  • b) SSTk, the variation in Xk (increases accuracy of
    estimates)
  • c) R²k, the coefficient of determination from a
    regression of Xk on all other independent
    variables (the degree of multicollinearity reduces
    accuracy of estimates)
  • What about the variance of the error terms, σ²?

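The component formula Var(β̂k) = σ² / [SSTk (1 − R²k)] agrees exactly with the corresponding diagonal entry of σ²(X′X)⁻¹, which a short check confirms (simulated data, names illustrative; σ² is treated as known here):

```python
import numpy as np

# Comparing Var(beta_hat_1) from sigma^2 * (X'X)^{-1} with
# sigma^2 / (SST_1 * (1 - R2_1)).
rng = np.random.default_rng(7)
n, sigma2 = 500, 1.0
X1 = rng.normal(size=n)
X2 = 0.5 * X1 + rng.normal(size=n)      # correlated regressors -> R2_1 > 0
X = np.column_stack([np.ones(n), X1, X2])

# Variance of beta_hat_1 from the matrix formula:
var_matrix = sigma2 * np.linalg.inv(X.T @ X)[1, 1]

# Variance of beta_hat_1 from the component formula:
SST1 = ((X1 - X1.mean()) ** 2).sum()
Z = np.column_stack([np.ones(n), X2])
r1 = X1 - Z @ np.linalg.lstsq(Z, X1, rcond=None)[0]
R2_1 = 1 - (r1 ** 2).sum() / SST1       # R^2 of X1 on the other regressors
var_formula = sigma2 / (SST1 * (1 - R2_1))

print(np.isclose(var_matrix, var_formula))   # True
```

Raising the correlation between X1 and X2 pushes R²1 toward 1 and inflates both expressions, which is the multicollinearity effect in point (c).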
12
Topic 3 The Multiple Regression Model
  • 7. Model specification
  • Inclusion of irrelevant variables
  • OLS estimator unbiased but with higher variance
    if Xs correlated
  • Exclusion of relevant variables
  • Omitted variable bias if variables correlated
    with variables included in the estimated model
  • True Model
  • Estimated Model
  • OLS estimator
  • Biased
  • Omitted Variable Bias
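A Monte Carlo sketch of omitted variable bias: when X2 (correlated with X1) is wrongly left out, the short-regression estimate of β1 centres on β1 + β2δ1 rather than β1 (all names and parameter values illustrative):

```python
import numpy as np

# Omitting X2, which is correlated with X1, shifts the expected
# estimate of beta1 by approximately beta2 * delta1.
rng = np.random.default_rng(8)
n, reps = 200, 1000
b1, b2, delta1 = 2.0, -0.5, 0.7      # design: X2 = delta1 * X1 + noise
short_est = np.empty(reps)
for r in range(reps):
    X1 = rng.normal(size=n)
    X2 = delta1 * X1 + rng.normal(size=n)
    Y = 1.0 + b1 * X1 + b2 * X2 + rng.normal(size=n)
    Z = np.column_stack([np.ones(n), X1])        # X2 wrongly omitted
    short_est[r] = np.linalg.lstsq(Z, Y, rcond=None)[0][1]

bias = short_est.mean() - b1    # simulated bias, close to b2 * delta1 = -0.35
```

The sign of the bias follows the slide's formula: here β2 < 0 and δ1 > 0, so the short regression systematically underestimates β1.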