1
Chapter 5: Estimating Demand Functions
2
Objectives
  • The identification problem
  • Consumer interviews
  • Market experiments
  • Regression analysis

3
Objectives
  • Simple regression analysis
  • Population and sample regression line
  • Coefficient of determination (R-squared)
  • Multiple regression analysis
  • Excel regression package (handout)
  • Interpreting the computer printout

4
Reading Material
  • Material is in Chapter 5 of the textbook, pages
    153-186, and in the handout on the mechanics of
    regression analysis using Excel.
  • You are not responsible for the following
    material from Chapter 5:
  • Method of least squares, pages 165-167.
  • Software packages, pages 175-176.

5
The Identification Problem
  • This problem refers to situations where the data
    contain insufficient information to provide the
    required answer.
  • It is the inability to distinguish between
    movements along a demand curve and shifts in
    supply and/or demand that reflect changes in
    behavior.
  • Consider the straightforward approach to
    estimating a demand curve:
  • Plot the quantity of a product sold against its
    price for (say) three years (2004, 2005, 2006).

6
The Identification Problem
7
Underestimating the Price Elasticity of Demand
Can Create Problems
[Figure: price-quantity diagram with demand curve D06; axis values 60 and 20]
8
Dealing with the Identification Problem
  • Because we are not controlling for changes in
    parameters that shift the demand curve, we cannot
    be sure that the demand curve remained the same
    during the sample period.
  • If the demand curve were fixed, changes in the
    supply curve would trace it out correctly.

9
Alternative Methods of Estimating Demand Functions
  • Consumer interviews.
  • Market experiments.
  • Regression analysis.

10
Consumer Interviews
  • Firms frequently engage in consumer interviews
    and surveys concerning consumers' buying habits,
    motives, and intentions.
  • One potential problem with this method is that
    the answers are frequently not very accurate.
  • Clever approaches can be used to remedy this
    situation.
  • Despite the limitations of consumer interviews
    and surveys, many firms use these techniques
    often.

11
Market Experiments
  • The idea of a market experiment is to vary the
    price of a product while keeping everything else
    the same.
  • Controlled laboratory experiments can sometimes
    be carried out.
  • Direct experimentation can be expensive and/or
    risky.
  • Profits might be reduced permanently or
    temporarily.
  • Customers might switch to competitors permanently
    if they face a price increase.

12
Regression Analysis
  • Suppose that a product's demand function is given
    by the following equation:
  • Y = A + B1X + B2P + B3I + B4Pr, where
  • Y = quantity demanded
  • X = advertising expenses
  • P = price of the product
  • I = consumers' disposable income
  • Pr = prices of competing (rival) goods
  • Regression analysis is a statistical technique
    that provides numerical values for the
    coefficients A, B1, B2, B3, and B4 (a small
    sketch of this specification follows below).
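To make the notation concrete, here is a minimal Python sketch of this linear demand specification; the coefficient values and the function name quantity_demanded are purely illustrative placeholders, not estimates from the textbook.

# Hypothetical coefficients for illustration only (not estimated values)
A, B1, B2, B3, B4 = 100.0, 2.0, -5.0, 0.5, 1.5

def quantity_demanded(X, P, I, Pr):
    """Linear demand specification: Y = A + B1*X + B2*P + B3*I + B4*Pr."""
    return A + B1 * X + B2 * P + B3 * I + B4 * Pr

# Example: advertising = 10, price = 8, income = 50, rival price = 7
print(quantity_demanded(X=10, P=8, I=50, Pr=7))   # 100 + 20 - 40 + 25 + 10.5 = 115.5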

13
Regression Analysis
  • The numerical values of the coefficients are
    extracted from historical data concerning the
    variables Y, X, P, I, and Pr.
  • Suppose we have the following data (observations)
    on the Miller Pharmaceutical Company:
  • Selling expense (millions)   Sales (millions)
  • 1 4
  • 2 6
  • 4 8
  • 8 14
  • 6 12
  • 5 10
  • 8 16
  • 9 16
  • 7 12

14
Simple Regression Analysis
  • Assumes that the mean value of the dependent
    variable is a linear function of the independent
    variable:
  • Yi = A + BXi + ei, where
  • Yi = the ith observed value of the dependent
    variable Y
  • Xi = the ith value of the independent variable X
  • ei = the error term that is added to (or
    subtracted from) the expression A + BXi to
    obtain the dependent variable Yi.

15
The Mean Value of Y, Given X, Falls on the Population
Regression Line
16
Sample Regression Line
  • The general expression for the sample regression
    line is
  • Yi-hat = a + bXi
  • where,
  • Yi-hat = the value of the dependent variable
    predicted by the regression line
  • a = the y-intercept of the regression line
  • b = the slope of the regression line
  • Xi = the value of the independent variable
    evaluated at the ith observation.

17
Sample Regression Line
  • The population regression line is based on the
    entire population, whereas the sample regression
    line is based only on the sample.
  • The regression equation for the Miller
    Pharmaceutical Company data on slide 13 is
  • Y-hat = 2.533 + 1.504X, where
  • Y-hat = sales in millions of dollars
  • X = selling expenses in millions of dollars
  • 2.533 = value of a, the estimator of A
  • 1.504 = value of b, the estimator of B
    (see the least-squares sketch below).
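A minimal Python/NumPy sketch (not part of the textbook) that fits this line to the slide 13 data by least squares; the intercept and slope should come out close to the 2.533 and 1.504 quoted above.

import numpy as np

# Miller Pharmaceutical data from slide 13
selling_expense = np.array([1, 2, 4, 8, 6, 5, 8, 9, 7], dtype=float)       # X, millions
sales           = np.array([4, 6, 8, 14, 12, 10, 16, 16, 12], dtype=float)  # Y, millions

# Ordinary least squares fit of Y = a + b*X (np.polyfit returns [slope, intercept])
b, a = np.polyfit(selling_expense, sales, deg=1)
print(f"Y-hat = {a:.3f} + {b:.3f} X")   # roughly 2.533 + 1.504 X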

18
Coefficient of Determination
  • Commonly called the R-squared. It measures the
    goodness of fit of the estimated regression line.
  • It varies between 0 and 1. The closer R2 is to 1,
    the better the fit.
  • For example, if R2 = 0.80, then X's variation
    explains about 80 percent of Y's variation
    (R2 for the Miller data is computed in the
    sketch below).
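A short sketch of the definition of R2, applied to the same Miller data; the resulting value (about 0.97) is computed here for illustration and is not quoted on the slide.

import numpy as np

selling_expense = np.array([1, 2, 4, 8, 6, 5, 8, 9, 7], dtype=float)
sales           = np.array([4, 6, 8, 14, 12, 10, 16, 16, 12], dtype=float)

# Fit the simple regression line and compute fitted values
b, a = np.polyfit(selling_expense, sales, deg=1)
fitted = a + b * selling_expense

# R^2 = 1 - (sum of squared residuals) / (total sum of squares)
ss_res = np.sum((sales - fitted) ** 2)
ss_tot = np.sum((sales - np.mean(sales)) ** 2)
print(f"R-squared = {1.0 - ss_res / ss_tot:.3f}")   # about 0.97 for these data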

19
Multiple Regression Analysis
  • Suppose that, in the case of the Miller
    Pharmaceutical Company, sales depend on both
    selling expenses and price:
  • Yi = a + b1Xi + b2Pi + ei
  • where Yi = sales
  • Xi = selling expense
  • Pi = price
  • The goal of multiple regression analysis is to
    estimate the coefficients a, b1, and b2.

20
Sales, Selling Expense and Price, Miller
Pharmaceutical Company
  • Selling expense (millions)   Sales (millions)   Price (less 10)
  • 1 4 1
  • 2 6 0
  • 4 8 5
  • 8 14 8
  • 6 12 4
  • 5 10 3
  • 8 16 2
  • 9 16 7
  • 7 12 6
  • The computer output gives the estimated
    regression equation Yi-hat = 2.53 + 1.76Xi - 0.35Pi
    (reproduced in the sketch below).
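A minimal NumPy sketch that fits this two-variable regression to the table above; the least-squares coefficients should come out close to the 2.53, 1.76, and -0.35 reported by the computer output.

import numpy as np

# Miller Pharmaceutical data from slide 20
selling_expense = np.array([1, 2, 4, 8, 6, 5, 8, 9, 7], dtype=float)        # X
sales           = np.array([4, 6, 8, 14, 12, 10, 16, 16, 12], dtype=float)   # Y
price           = np.array([1, 0, 5, 8, 4, 3, 2, 7, 6], dtype=float)        # P (less 10)

# Design matrix with a column of ones for the intercept
design = np.column_stack([np.ones_like(selling_expense), selling_expense, price])

# Least-squares estimates of a, b1, b2
coef, *_ = np.linalg.lstsq(design, sales, rcond=None)
a, b1, b2 = coef
print(f"Y-hat = {a:.2f} + {b1:.2f} X {b2:+.2f} P")   # roughly 2.53 + 1.76 X - 0.35 P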

21
Standard error of estimate or mean square error
  • A measure of the amount of scatter of individual
    observations about the regression line.
  • Useful for constructing prediction intervals:
  • Y-hat +/- 2 standard errors.
  • In the case of Miller Pharmaceutical, the standard
    error of estimate = 0.37 million units of sales,
    based on the multiple regression.
  • For example, if the predicted value of Miller
    Pharmaceutical Company's sales is 11 million
    units, then with probability 95 percent the
    firm's sales will be between
  • 10.26 = 11 - 2(0.37) and 11.74 = 11 + 2(0.37)
    (see the sketch below).
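A sketch of how the standard error of estimate and the rough 95 percent prediction interval above could be computed; it assumes the conventional divisor of n minus the number of estimated coefficients, which reproduces a value of about 0.37 for the Miller data.

import numpy as np

selling_expense = np.array([1, 2, 4, 8, 6, 5, 8, 9, 7], dtype=float)
sales           = np.array([4, 6, 8, 14, 12, 10, 16, 16, 12], dtype=float)
price           = np.array([1, 0, 5, 8, 4, 3, 2, 7, 6], dtype=float)

design = np.column_stack([np.ones_like(selling_expense), selling_expense, price])
coef, *_ = np.linalg.lstsq(design, sales, rcond=None)
residuals = sales - design @ coef

# Standard error of estimate: sqrt(SSE / (n - number of estimated coefficients))
n, num_coef = design.shape
std_error = np.sqrt(np.sum(residuals ** 2) / (n - num_coef))
print(f"standard error of estimate = {std_error:.2f}")   # about 0.37

# Rough 95% prediction interval around a predicted value of 11 (as on the slide)
y_hat = 11.0
print(y_hat - 2 * std_error, y_hat + 2 * std_error)       # about 10.26 to 11.74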

22
F-Statistic
  • Tells us whether or not the group of independent
    variables explains a statistically significant
    portion of the variation in the dependent
    variable.
  • Large values of the F-statistic indicate that at
    least one of the independent variables is helping
    to explain variation in the dependent variable.
  • Tables of the F-distribution are used to
    determine the probability that an observed value
    of the F-statistic could have arisen by chance
    if none of the independent variables had any
    effect on the dependent variable (the formula is
    sketched below).
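One common way to compute the F-statistic is from R2: F = (R2/k) / ((1 - R2)/(n - k - 1)), where k is the number of independent variables and n the number of observations. A minimal sketch; the numerical inputs in the example are illustrative, not taken from the slides.

def f_statistic(r_squared: float, n: int, k: int) -> float:
    """F = (R^2 / k) / ((1 - R^2) / (n - k - 1)) for a regression with
    k independent variables and n observations."""
    return (r_squared / k) / ((1.0 - r_squared) / (n - k - 1))

# Example: an R^2 of roughly 0.99 with n = 9 observations and k = 2 regressors
print(f_statistic(0.99, n=9, k=2))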

23
t-Statistic
  • Tells us whether or not each particular
    independent variable explains a statistically
    significant portion of the variation in the
    dependent variable
  • All else equal, larger values of the t-statistic
    (in absolute value) are better.
  • Rule of thumb: as long as the t-statistic is
    greater than 2, the independent variable belongs
    in the regression equation.
  • In the case of Miller Pharmaceutical, the
    t-statistic for selling expenses is about 25.35,
    which means that this variable is highly
    significant (see the sketch below).
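A sketch of how the t-statistics for the Miller multiple regression could be computed: each coefficient divided by its standard error, with the standard errors taken from the diagonal of s2(X'X)^-1. This is the standard OLS calculation, not code from the textbook; the t-statistic on selling expense should come out near the 25.35 quoted above.

import numpy as np

selling_expense = np.array([1, 2, 4, 8, 6, 5, 8, 9, 7], dtype=float)
sales           = np.array([4, 6, 8, 14, 12, 10, 16, 16, 12], dtype=float)
price           = np.array([1, 0, 5, 8, 4, 3, 2, 7, 6], dtype=float)

X = np.column_stack([np.ones_like(selling_expense), selling_expense, price])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
residuals = sales - X @ coef

n, p = X.shape
s2 = np.sum(residuals ** 2) / (n - p)      # estimated error variance
cov = s2 * np.linalg.inv(X.T @ X)          # covariance matrix of the coefficients
std_errors = np.sqrt(np.diag(cov))

print(coef / std_errors)   # t-statistics; selling expense should be roughly 25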

24
Multicollinearity
  • This is a situation in which two or more
    independent variables are very highly correlated.
  • In this case the t-statistics are meaningless.
  • We can estimate the effects of the correlated
    variables as a group on the dependent variable,
    but not the effects of each of the independent
    variables separately.
  • To cope with this situation one might have to
    alter the independent variables, or use variables
    that are not highly correlated with each other.
  • Sometimes one might have to collect new data to
    deal with the problem of multicollinearity
    (a quick correlation check is sketched below).
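A quick way to screen for multicollinearity is to inspect the pairwise correlations among the independent variables; a minimal sketch using the Miller data (illustrative, not from the slides).

import numpy as np

selling_expense = np.array([1, 2, 4, 8, 6, 5, 8, 9, 7], dtype=float)
price           = np.array([1, 0, 5, 8, 4, 3, 2, 7, 6], dtype=float)

# Correlation between the two independent variables; values near +1 or -1
# would signal a potential multicollinearity problem.
corr = np.corrcoef(selling_expense, price)[0, 1]
print(f"corr(selling expense, price) = {corr:.2f}")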

25
Serial Correlation of Error Terms
26
Serial Correlation
  • In this case the error terms are not independent
    of each other.
  • This problem usually arises in time-series
    samples.
  • The Durbin-Watson test is used to detect whether
    or not serial correlation is present in the error
    terms of a regression.
  • The rule of thumb is that, in the absence of
    serial correlation, the Durbin-Watson statistic d
    should be close to 2.
  • One way to deal with the problem of serial
    correlation is to take first differences of all
    the independent and dependent variables in the
    regression (the statistic is sketched below).
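A minimal sketch of the Durbin-Watson statistic: the sum of squared successive differences of the residuals divided by the sum of squared residuals. The residual series in the example is made up for illustration.

import numpy as np

def durbin_watson(residuals) -> float:
    """Durbin-Watson statistic d; values near 2 indicate little
    first-order serial correlation in the residuals."""
    residuals = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(residuals) ** 2) / np.sum(residuals ** 2)

# Example with a made-up residual series (illustrative only)
print(durbin_watson([0.3, -0.2, 0.5, -0.4, 0.1, -0.3, 0.2]))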

27
Further Analysis of the Residuals
28
Further Analysis of the Residuals
  • In this case, the residuals (estimated errors) of
    a regression indicate that the relationship
    between the dependent and independent variables
    is not linear.
  • One might want to try a quadratic functional form
    instead of a linear one (see the sketch below).
  • The textbook describes another case, in which the
    variation of the residuals changes with the level
    of the independent variable.
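If the residuals suggest curvature, one simple alternative is a quadratic specification. A minimal sketch using numpy.polyfit; the data here are placeholders, not from the textbook.

import numpy as np

# Placeholder data showing a curved relationship (illustrative only)
x = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
y = np.array([2.1, 3.9, 7.2, 11.8, 18.1, 26.0, 35.9, 47.8], dtype=float)

# Fit Y = c0 + c1*x + c2*x^2 instead of a straight line
c2, c1, c0 = np.polyfit(x, y, deg=2)
print(f"Y-hat = {c0:.2f} + {c1:.2f} x + {c2:.2f} x^2")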

29
Summary
  • The identification problem refers to situations
    where there is insufficient information to
    estimate the true parameters of an equation.
  • Regression analysis is useful in estimating
    demand functions and other economic relations
    from available data.
  • A simple regression includes only one independent
    variable, whereas a multiple regression includes
    more than one independent variable.
  • In a simple regression, the coefficient of
    determination is used to measure the closeness of
    fit of the regression line.

30
Summary
  • In a multiple regression analysis, the multiple
    coefficient of determination, R2, measures the
    goodness of fit.
  • The F statistic can be used to test whether any
    of the independent variables has an effect on the
    dependent variable.
  • The standard error of the estimate can be used to
    construct prediction intervals for the dependent
    variable.
  • The t-statistic for a regression coefficient of
    each independent variable can be used to test
    whether this independent variable has any effect
    on the dependent variable.
  • Sometimes we encounter problems of
    multicollinearity, serial correlation, and
    residual patterns that need to be treated
    appropriately.