
1
Chapter 12
  • Linear Regression

2
Introduction
  • Regression analysis and analysis of variance are
    the two most widely used statistical procedures.
  • Regression analysis is used for
    • Description
    • Prediction
    • Estimation

3
12.1 Simple Linear Regression
  • The simple linear regression model, Eq. (12.1), and
    its fitted prediction equation, Eq. (12.2)
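A sketch of the standard forms these labels presumably refer to (an assumption, since the equations themselves are not shown above): the simple linear regression model and its fitted prediction equation,

    Y = \beta_0 + \beta_1 X + \epsilon                      (12.1)
    \hat{Y} = \hat{\beta}_0 + \hat{\beta}_1 X               (12.2)

where \epsilon is a random error term and \hat{\beta}_0, \hat{\beta}_1 are the least squares estimates of the intercept and slope.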
4
12.1 Simple Linear Regression
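For reference, a sketch of the least squares estimates used throughout the chapter (the standard result; \bar{x} and \bar{y} denote the sample means of the x and y values):

    \hat{\beta}_1 = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}, \qquad
    \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}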

5
Table 12.1 Quality Improvement Data
Month       Time Devoted to Quality Improvement   % Nonconforming
January     56                                    20
February    58                                    19
March       55                                    20
April       62                                    16
May         63                                    15
June        68                                    14
July        66                                    15
August      68                                    13
September   70                                    10
October     67                                    13
November    72                                     9
December    74                                     8
6
Figure 12.1 Scatter Plot
7
Figure 12.1a Scatter Plot
8
12.1 Simple Linear Regression

9
12.1 Simple Linear Regression
  The regression equation is
  Y = 55.9 - 0.641 X

  Predictor     Coef       SE Coef    T        P
  Constant      55.923     2.824      19.80    0.000
  X             -0.64067   0.04332    -14.79   0.000

  S = 0.888854   R-Sq = 95.6%   R-Sq(adj) = 95.2%

  Analysis of Variance
  Source          DF    SS        MS        F         P
  Regression       1    172.77    172.77    218.67    0.000
  Residual Error  10      7.90      0.79
  Total           11    180.67
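A minimal Python sketch (not part of the original slides) that refits the Table 12.1 data by ordinary least squares; it reproduces the reported equation Y = 55.9 - 0.641 X, R-Sq = 95.6%, and S = 0.8889 to rounding:

    import numpy as np

    # Table 12.1: time devoted to quality improvement (x), % nonconforming (y)
    x = np.array([56, 58, 55, 62, 63, 68, 66, 68, 70, 67, 72, 74], dtype=float)
    y = np.array([20, 19, 20, 16, 15, 14, 15, 13, 10, 13, 9, 8], dtype=float)

    # Least squares estimates: slope = Sxy / Sxx, intercept = ybar - slope * xbar
    sxx = np.sum((x - x.mean()) ** 2)
    sxy = np.sum((x - x.mean()) * (y - y.mean()))
    b1 = sxy / sxx
    b0 = y.mean() - b1 * x.mean()

    fitted = b0 + b1 * x
    sse = np.sum((y - fitted) ** 2)      # residual (error) sum of squares
    sst = np.sum((y - y.mean()) ** 2)    # total sum of squares
    r_sq = 1.0 - sse / sst               # coefficient of determination
    s = np.sqrt(sse / (len(x) - 2))      # residual standard deviation (S)

    print(f"Y = {b0:.3f} {b1:+.5f} X   R-Sq = {100 * r_sq:.1f}%   S = {s:.6f}")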

10
12.1 Simple Linear Regression

11
12.2 Worth of the Prediction Equation
Obs X Y Fit SE Fit Residual St Resid
1 56.0 20.000 20.046 0.464 -0.046 -0.06
2 58.0 19.000 18.765 0.395 0.235 0.30
3 55.0 20.000 20.687 0.500 -0.687 -0.93
4 62.0 16.000 16.202 0.286 -0.202 -0.24
5 63.0 15.000 15.561 0.270 -0.561 -0.66
6 68.0 14.000 12.358 0.289 1.642 1.95
7 66.0 15.000 13.639 0.261 1.361 1.60
8 68.0 13.000 12.358 0.289 0.642 0.76
9 70.0 10.000 11.077 0.338 -1.077 -1.31
10 67.0 13.000 12.999 0.272 0.001 0.00
11 72.0 9.000 9.795 0.400 -0.795 -1.00
12 74.0 8.000 8.514 0.470 -0.514 -0.68
12
12.2 Worth of the Prediction Equation
  • The worth of the prediction equation is assessed with
    the coefficient of determination, Eq. (12.4)
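Assuming Eq. (12.4) is the standard definition of the coefficient of determination:

    R^2 = \frac{SS_{regression}}{SS_{total}} = 1 - \frac{SS_{error}}{SS_{total}}

For the Table 12.1 data, R^2 = 172.77 / 180.67 = 0.956, which matches the R-Sq = 95.6% reported in the regression output.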
13
12.3 Assumptions
  • The assumptions underlying model (12.1)
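Stated here as the standard assumptions for model (12.1), an assumption about what the slide lists: the errors are independent and normally distributed with mean zero and constant variance,

    \epsilon_i \stackrel{iid}{\sim} N(0, \sigma^2), \quad i = 1, \ldots, n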
14
12.4 Checking Assumptions through Residual Plots
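A minimal Python sketch (assumed, not from the slides) of the residuals-versus-fitted-values plot used to check the linearity and constant-variance assumptions for the Table 12.1 fit; matplotlib is assumed to be available:

    import numpy as np
    import matplotlib.pyplot as plt

    x = np.array([56, 58, 55, 62, 63, 68, 66, 68, 70, 67, 72, 74], dtype=float)
    y = np.array([20, 19, 20, 16, 15, 14, 15, 13, 10, 13, 9, 8], dtype=float)

    # Refit the line (same least squares estimates as in the earlier sketch)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    fitted = b0 + b1 * x

    # Residuals vs. fitted values: look for curvature or changing spread
    plt.scatter(fitted, y - fitted)
    plt.axhline(0.0, linestyle="--")
    plt.xlabel("Fitted value")
    plt.ylabel("Residual")
    plt.title("Residuals vs. fitted values, Table 12.1 data")
    plt.show()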

15
12.4 Checking Assumptions through Residual Plots
16
12.5 Confidence Intervals
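A sketch of the standard confidence interval for the slope, assumed to be what this slide presents:

    \hat{\beta}_1 \pm t_{\alpha/2,\, n-2}\, s_{\hat{\beta}_1}

Using the regression output, a 95% interval for the slope is -0.64067 \pm 2.228(0.04332), i.e., approximately (-0.737, -0.544), where 2.228 is the t-value with n - 2 = 10 degrees of freedom.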

17
12.5 Hypothesis Test
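A sketch of the usual test of H_0: \beta_1 = 0, assumed to be what this slide covers:

    t = \hat{\beta}_1 / s_{\hat{\beta}_1} = -0.64067 / 0.04332 = -14.79

with n - 2 = 10 degrees of freedom, matching the T value reported for X in the regression output and giving a p-value of 0.000 to three decimal places.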

18
12.6 Prediction Interval for Y
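A sketch of the standard 100(1 - \alpha)% prediction interval for a new observation Y at X = x_0, assumed to be the form intended here:

    \hat{Y}_0 \pm t_{\alpha/2,\, n-2}\, s \sqrt{1 + \frac{1}{n} + \frac{(x_0 - \bar{x})^2}{S_{xx}}}

where s is the residual standard deviation and S_{xx} = \sum_i (x_i - \bar{x})^2. The interval is wider than a confidence interval for the mean response because it also accounts for the variability of a single future observation.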

19
12.6 Prediction Interval for Y
20
12.7 Regression Control Chart
  • Center line and control limits for the regression
    control chart, Eqs. (12.5) and (12.6)
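One common formulation, given here as a hedged sketch of what Eqs. (12.5) and (12.6) may contain: the fitted line serves as the center line, and the limits sit a multiple of the residual standard deviation above and below it,

    \text{center line: } \hat{Y} = \hat{\beta}_0 + \hat{\beta}_1 X
    \text{control limits: } \hat{Y} \pm k\, s, \quad k = 2 \text{ or } 3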
21
12.8 Cause-Selecting Control Chart
  • The general idea is to distinguish quality problems
    that occur at one stage of a process from problems
    that occur at a previous processing step.
  • Let Y be the output from the second step and let
    X denote the output from the first step. The
    relationship between X and Y is then modeled, as
    sketched below.
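In the usual construction (an assumption here, since the chart statistic is not given on the slide), the regression of Y on X is fitted and the chart monitors the residuals

    e_i = Y_i - \hat{Y}_i

so that a signal points to a problem introduced at the second step rather than one inherited from the first.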

22
12.9 Linear, Nonlinear, and Nonparametric Profiles
  • A profile is a (linear, nonlinear, or nonparametric)
    relationship between a response variable and one or
    more explanatory variables that characterizes the
    quality of a process or product.
  • One possible approach is to monitor each parameter
    in the model with a Shewhart chart.
  • The independent variables must be fixed.
  • A control chart for R² can also be used.

23
12.10 Inverse Regression
  • An important application of simple linear
    regression for quality improvement is in the area
    of calibration.
  • Assume two measuring tools are available: one is
    quite accurate but expensive to use, and the other
    is less expensive but also less accurate. If the
    measurements obtained from the two devices are
    highly correlated, then the measurement that would
    have been made with the expensive device can be
    predicted fairly well from the measurement made
    with the less expensive device, as sketched below.
  • Let Y = the measurement from the less expensive
    device and X = the measurement from the accurate
    device.
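A sketch of the calibration (inverse regression) estimate, assuming the classical approach: regress Y on X using pairs measured by both devices, then invert the fitted line to estimate X from a new reading Y_0 taken with the less expensive device,

    \hat{Y} = \hat{\beta}_0 + \hat{\beta}_1 X, \qquad
    \hat{X}_0 = \frac{Y_0 - \hat{\beta}_0}{\hat{\beta}_1}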

24
12.10 Inverse Regression

25
12.10 Inverse Regression Example

   Y     X
  2.3   2.4
  2.5   2.6
  2.4   2.5
  2.8   2.9
  2.9   3.0
  2.6   2.7
  2.4   2.5
  2.2   2.3
  2.1   2.2
  2.7   2.7
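A minimal Python sketch (not part of the original slides) that fits the calibration line to the example data above and inverts it for a new reading; the value Y0 = 2.5 is illustrative only, not taken from the original example:

    import numpy as np

    # Example data: y = less expensive device, x = accurate device
    y = np.array([2.3, 2.5, 2.4, 2.8, 2.9, 2.6, 2.4, 2.2, 2.1, 2.7])
    x = np.array([2.4, 2.6, 2.5, 2.9, 3.0, 2.7, 2.5, 2.3, 2.2, 2.7])

    # Fit Y on X by least squares
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()

    # Invert the fitted line: estimate X from a new inexpensive-device reading
    y0 = 2.5                      # illustrative new reading
    x_hat = (y0 - b0) / b1        # classical calibration estimate of X
    print(f"Y = {b0:.4f} + {b1:.4f} X   estimated X for Y0 = {y0}: {x_hat:.4f}")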
26
12.11 Multiple Linear Regression
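The multiple linear regression model extends (12.1) to m regressors; stated here in its standard form:

    Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_m X_m + \epsilon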

27
12.12 Issues in Multiple Regression
12.12.1 Variable Selection

28
12.12.3 Multicollinear Data
  • Problems occur when at least two of the regressors
    are related to each other in some manner.
  • Solutions:
    • Discard one or more of the variables causing the
      multicollinearity.
    • Use ridge regression, sketched below.
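For reference (the slide does not give the details), ridge regression stabilizes the estimates when the regressors are nearly collinear by adding a small biasing constant k to the least squares normal equations:

    \hat{\beta}_R = (X'X + kI)^{-1} X'y, \quad k \ge 0

with k = 0 giving ordinary least squares.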

29
12.12.4 Residual Plots

30
12.12.6 Transformations