Multiple Regression Model: Asymptotic Properties of the OLS Estimator

1
Multiple Regression Model: Asymptotic
Properties of the OLS Estimator
  • y = β0 + β1 x1 + β2 x2 + . . . + βk xk + u

2
Lecture 5: THE MULTIPLE REGRESSION MODEL:
ASYMPTOTIC PROPERTIES OF THE OLS ESTIMATOR
Professor Victor Aguirregabiria
  • OUTLINE
  • Convergence in Probability: the Law of Large
    Numbers (LLN)
  • Consistency of OLS
  • Convergence in Distribution: the Central Limit
    Theorem (CLT)
  • Asymptotic Normality of OLS
  • Asymptotic Tests
  • Asymptotic Efficiency of OLS

3
1. Convergence in Probability: the Law of Large
Numbers
  • Consider the sequence of random variables
  • Z1, Z2, . . . , Zn, . . .
  • We say that this sequence converges in
    probability to a constant c if, for any small
    constant ε > 0,
  • the limit as n goes to infinity of Prob(|Zn −
    c| > ε) is zero.
  • That is, as n increases, the probability
    distribution of Zn becomes more and more
    concentrated around the constant c.
  • In the limit, the random variable Zn is equal to
    the constant c with probability 1.

4
Law of Large Numbers (LLN)
  • Let y1, y2, . . . , yn be a random sample of a
    random variable Y that has a finite variance.
  • Define the sample mean ȳn = (1/n) Σi yi
  • The LLN says that the sample mean converges in
    probability to the population mean of Y:
    plim ȳn = E(Y). A quick simulation follows
    below.
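
As a quick illustration, here is a minimal Python sketch (numpy only; the exponential distribution and the sample sizes are illustrative choices, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0  # population mean of Y ~ Exponential(scale=2), which has finite variance

# The sample mean should cluster ever more tightly around mu as n grows.
for n in [10, 100, 10_000, 1_000_000]:
    y = rng.exponential(scale=mu, size=n)
    print(f"n = {n:>9,}   sample mean = {y.mean():.4f}")
```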

5
Consistency
  • Under the Gauss-Markov assumptions, OLS is BLUE.
  • However, when we relax some of the assumptions
    (e.g., constant variance), it won't always be
    possible to find unbiased estimators.
  • In those cases, we may still find estimators
    that are consistent, meaning that as n → ∞, the
    distribution of the estimator collapses to the
    true parameter value.

6
Sampling Distributions as n → ∞
[Figure: sampling distributions of the estimator b1
for sample sizes n1 < n2 < n3; as n grows, the
distribution becomes more concentrated around the
true parameter value.]
7
2. Consistency of OLS
  • Under the Gauss-Markov assumptions, the OLS
    estimator is consistent (and unbiased).
  • Consistency can be proved for the simple
    regression case in a manner similar to the proof
    of unbiasedness.
  • We will need to take the probability limit
    (plim) to establish consistency.

8
Proving Consistency
  • For the simple regression y = β0 + β1 x1 + u,
    the OLS slope estimator can be written as
  • β̂1 = β1 + [ n⁻¹ Σi (xi1 − x̄1) ui ] / [ n⁻¹ Σi (xi1 − x̄1)² ]
  • By the LLN, the numerator converges in
    probability to Cov(x1, u) = 0 and the denominator
    to Var(x1) > 0, so plim β̂1 = β1.
  • The Monte Carlo sketch below checks this
    numerically.
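
The following numpy sketch checks the argument numerically (the coefficient values, the t-distributed errors, and the sample sizes are illustrative assumptions): the OLS slope collapses onto the true β1 as n grows.

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1 = 1.0, 0.5  # "true" parameters, chosen for illustration

# OLS slope = sample Cov(x, y) / sample Var(x); consistency says it
# converges in probability to beta1 when Cov(x, u) = 0.
for n in [50, 500, 5_000, 500_000]:
    x = rng.normal(size=n)
    u = rng.standard_t(df=5, size=n)   # non-normal, zero-mean error
    y = beta0 + beta1 * x + u
    b1_hat = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    print(f"n = {n:>7,}   b1_hat = {b1_hat:.4f}")
```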
9
A Weaker Assumption
  • For unbiasedness, we assumed a zero conditional
    mean: E(u | x1, x2, . . . , xk) = 0
  • For consistency, we can have the weaker
    assumption of zero mean and zero correlation:
    E(u) = 0 and Cov(xj, u) = 0, for j = 1, 2, . . . , k
  • Without this assumption, OLS will be biased and
    inconsistent!

10
Deriving the Inconsistency
  • Just as we could derive the omitted variable
    bias earlier, now we want to think about the
    inconsistency, or asymptotic bias, in this case:
  • plim β̂1 − β1 = Cov(x1, u) / Var(x1)
  • A numerical illustration follows below.
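
A numerical illustration of the formula, as a numpy sketch (the value Cov(x1, u) = 0.3 and the jointly normal design are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
beta1 = 0.5
cov_xu, var_x = 0.3, 1.0           # deliberate endogeneity: Cov(x1, u) = 0.3
plim_b1 = beta1 + cov_xu / var_x   # asymptotic bias formula gives plim = 0.8

# Draw (x, u) jointly normal with the covariance above, then run OLS.
n = 1_000_000
cov = np.array([[var_x, cov_xu], [cov_xu, 1.0]])
x, u = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
y = 1.0 + beta1 * x + u
b1_hat = np.cov(x, y, bias=True)[0, 1] / np.var(x)
print(f"b1_hat = {b1_hat:.3f}   plim = {plim_b1:.3f}   true beta1 = {beta1}")
```

Adding more data moves b1_hat toward 0.8, not toward 0.5: the inconsistency does not go away.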

11
Asymptotic Bias
  • So, thinking about the direction of the
    asymptotic bias is just like thinking about the
    direction of bias for an omitted variable.
  • The main difference is that asymptotic bias uses
    the population variance and covariance, while
    bias uses their sample counterparts.
  • Remember, inconsistency is a large-sample
    problem: it doesn't go away as we add more data.

12
3. Convergence in Distribution: the Central Limit
Theorem
  • Recall that under the CLM assumptions, the
    sampling distributions are normal, so we could
    derive t and F distributions for testing.
  • This exact normality was due to assuming that
    the population error distribution was normal.
  • This assumption of normal errors implied that
    the distribution of y, given the x's, was normal
    as well.

13
3. Convergence in Distribution
  • It is easy to come up with examples for which
    this exact normality assumption will fail.
  • Any clearly skewed variable, like wages,
    arrests, or savings, can't be normal, since a
    normal distribution is symmetric.
  • The normality assumption is not needed to
    conclude that OLS is BLUE, only for inference.

14
Central Limit Theorem
  • The central limit theorem states that the
    standardized sample mean of any sample with mean
    μ and variance σ² is asymptotically N(0, 1):
  • √n (ȳn − μ) / σ →d N(0, 1)
  • The sketch below checks this for a clearly
    skewed variable.
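
A minimal simulation sketch of this statement (the exponential data and the particular n and replication count are illustrative assumptions): even though each observation is heavily skewed, the standardized sample mean behaves like a standard normal.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma = 1.0, 1.0   # Exponential(1): skewed, mean 1, standard deviation 1
n, reps = 200, 50_000

# Standardized sample means: sqrt(n) * (ybar - mu) / sigma.
ybar = rng.exponential(scale=mu, size=(reps, n)).mean(axis=1)
z = np.sqrt(n) * (ybar - mu) / sigma
print(f"mean = {z.mean():+.3f}, sd = {z.std():.3f}")            # close to 0 and 1
print(f"P(z <= 1.96) = {(z <= 1.96).mean():.3f}   (N(0,1) gives 0.975)")
```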

15
4. Asymptotic Normality of OLS
  • Under the Gauss-Markov assumptions,
  • (i) √n (β̂j − βj) →d Normal(0, σ²/a²j), where a²j
    = plim n⁻¹ Σi r̂²ij and the r̂ij are the residuals
    from regressing xj on the other regressors;
  • (ii) σ̂² is a consistent estimator of σ²;
  • (iii) (β̂j − βj) / se(β̂j) →d Normal(0, 1)
16
4. Asymptotic Normality of OLS
  • Because the t distribution approaches the normal
    distribution for large df, we can also say that
  • (β̂j − βj) / se(β̂j) is approximately distributed
    as t(n−k−1) in large samples.
  • Note that while we no longer need to assume
    normality with a large sample, we do still need
    homoskedasticity; the simulation below
    illustrates the large-sample approximation.
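
The simulation below sketches this point under illustrative assumptions (skewed but homoskedastic errors, n = 500): the usual t statistic for β1 rejects at close to its nominal 5% level even though u is far from normal.

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps, beta1 = 500, 10_000, 0.5
tstats = np.empty(reps)

for r in range(reps):
    x = rng.normal(size=n)
    u = rng.exponential(1.0, size=n) - 1.0   # skewed, mean zero, homoskedastic
    y = 1.0 + beta1 * x + u
    X = np.column_stack([np.ones(n), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    sigma2_hat = resid @ resid / (n - 2)     # consistent for Var(u)
    se_b1 = np.sqrt(sigma2_hat * np.linalg.inv(X.T @ X)[1, 1])
    tstats[r] = (b[1] - beta1) / se_b1

# Under asymptotic normality, |t| > 1.96 should occur about 5% of the time.
print(f"rejection rate = {(np.abs(tstats) > 1.96).mean():.3f}")
```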

17
Asymptotic Standard Errors
  • If u is not normally distributed, we will
    sometimes refer to the standard error as an
    asymptotic standard error, since it is justified
    only in large samples: se(β̂j) ≈ cj / √n for some
    constant cj.
  • So, we can expect standard errors to shrink at a
    rate proportional to the inverse of √n, as the
    sketch below illustrates.
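
A small sketch of the √n rate (the simple-regression design is an illustrative assumption): quadrupling n should roughly halve the standard error.

```python
import numpy as np

rng = np.random.default_rng(5)

def slope_se(n):
    """Conventional standard error of the OLS slope in one simulated sample."""
    x = rng.normal(size=n)
    y = 1.0 + 0.5 * x + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    sigma2_hat = resid @ resid / (n - 2)
    return np.sqrt(sigma2_hat * np.linalg.inv(X.T @ X)[1, 1])

# Each fourfold increase in n should cut se(b1) roughly in half.
for n in [1_000, 4_000, 16_000]:
    print(f"n = {n:>6,}   se(b1) = {slope_se(n):.5f}")
```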

18
5. Asymptotic Tests: the Lagrange Multiplier (LM)
Test
  • Once we are using large samples and relying on
    asymptotic normality for inference, we can use
    more than just t and F statistics.
  • The Lagrange multiplier, or LM, statistic is an
    alternative for testing multiple exclusion
    restrictions.

19
LM Test
  • Suppose we have a standard model,
  • y = β0 + β1 x1 + β2 x2 + . . . + βk xk + u
  • and our null hypothesis implies a restricted
    model, e.g. H0: βk−q+1 = 0, . . . , βk = 0, so
    that the restricted model omits the last q
    regressors.

20
LM Test
  • First, we just run an OLS regression for the
    restricted model and save its residuals, ũ.
  • Second, we take those residuals and regress them
    on all of the original regressors x1, . . . , xk,
    obtaining the R² of this regression, R²ũ.

21
LM Test
  • Under the null hypothesis, we have that
  • LM = n R²ũ →d χ²q
  • With a large sample, the result from an F test
    and from an LM test should be similar, as in the
    worked example below.
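
Here is a worked numpy/scipy sketch of the three steps (the data-generating process, with q = 2 truly irrelevant regressors so that H0 holds, is an illustrative assumption):

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(6)
n, q = 1_000, 2
x1, x2, x3 = rng.normal(size=(3, n))
y = 1.0 + 0.5 * x1 + rng.normal(size=n)   # x2, x3 are irrelevant: H0 is true

# Step 1: OLS on the restricted model (x2, x3 excluded); keep the residuals.
Xr = np.column_stack([np.ones(n), x1])
br, *_ = np.linalg.lstsq(Xr, y, rcond=None)
u_tilde = y - Xr @ br

# Step 2: regress the residuals on ALL the regressors and take the R-squared.
Xf = np.column_stack([np.ones(n), x1, x2, x3])
bf, *_ = np.linalg.lstsq(Xf, u_tilde, rcond=None)
resid = u_tilde - Xf @ bf
r2 = 1.0 - (resid @ resid) / (u_tilde @ u_tilde)   # u_tilde has mean zero

# Step 3: LM = n * R^2 is asymptotically chi-squared with q df under H0.
LM = n * r2
print(f"LM = {LM:.3f}, p-value = {chi2.sf(LM, df=q):.3f}")
```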

22
6. Asymptotic Efficiency of OLS
  • Estimators besides OLS can also be consistent.
  • However, under the Gauss-Markov assumptions, the
    OLS estimators have the smallest asymptotic
    variances.
  • We say that OLS is asymptotically efficient.
  • It is important to remember our assumptions,
    though: if the errors are not homoskedastic, this
    result no longer holds.