1
Measurement and Statistical Analysis of Customer Satisfaction (Messung und statistische Analyse von Kundenzufriedenheit)
  • KF Qualitätsmanagement (Quality Management)
  • Vertiefungskurs V (Advanced Course V)

2
Outline
  • Customer satisfaction measurement
  • The Structural Equation Model (SEM)
  • Estimation of SEMs
  • Evaluation of SEMs
  • Practice of SEM-Analysis

3
The ACSI Model
[Path diagram of the ACSI model]
Ref.: http://www.theacsi.org/model.htm
4
ACSI-Model Latent Variables
  • Customer Expectations: combine the customers' experiences and information about the offering via media, advertising, salespersons, and word-of-mouth from other customers
  • Perceived Quality: overall quality, reliability, the extent to which a product/service meets the customer's needs
  • Customer Satisfaction: overall satisfaction, fulfillment of expectations, comparison with the ideal
  • Perceived Value: overall price given quality and overall quality given price
  • Customer Complaints: percentage of respondents who reported a problem
  • Customer Loyalty: likelihood to purchase at various price points

5

ACSI scores (Q2 of each year); the two change columns give the percent change vs. the previous year and vs. the first year measured:

                               Base  1995  1996  1997  1998  1999  2000  2001  2002  2003  2004  Δ prev  Δ base
MANUFACTURING/DURABLES         79.2  79.8  78.8  78.4  77.9  77.3  79.4  78.7  79.0  79.2  78.3   -1.1    -1.1
Personal Computers             78    75    73    70    71    72    74    71    71    72    74      2.8    -5.1
Apple Computer, Inc.           77    75    76    70    69    72    75    73    73    77    81      5.2     5.2
Dell Inc.                      NM    NM    NM    72    74    76    80    78    76    78    79      1.3     9.7
Gateway, Inc.                  NM    NM    NM    NM    76    76    78    73    72    69    74      7.2    -2.6
All Others                     NM    70    73    72    69    69    68    67    70    69    71      2.9     1.4
Hewlett-Packard Company (HP)   78    80    77    75    72    74    74    73    71    70    71      1.4    -9.0
Hewlett-Packard (Compaq)       78    77    74    67    72    71    71    69    68    68    69      1.5   -11.5
6
The European Customer Satisfaction Index (ECSI)
[Path diagram of the ECSI model]
Ref.: http://www.swics.ch/ecsi/index.html
7
ACSIe-Model for Food Retail
[Path diagram, Hackl et al. (2000): latent variables Expectations, Perceived Quality, Value, Emotional Factor, Customer Satisfaction, Loyalty; path coefficients 0.33, 0.35, 0.37, 0.36, 0.73, 0.34, 0.53, 0.34, and, in parentheses, (-0.01) and (0.06)]
8
Austrian Food Retail Market
  • Pilot for an Austrian National CS Index (Zuba, 1997)
  • Data collection: December 1996 by Dr. Fessel + GfK (professional market research agency)
  • 839 interviews, 327 complete observations
  • Austria-wide active food retail chains (1996: 50% of the 10.5 BEUR market)
  • Billa: well-assorted medium-sized outlets
  • Hofer: limited range at good prices
  • Merkur: large-sized supermarkets with a comprehensive range
  • Meinl: top in quality and service

9
The Data
Indicators | Latent variable
total expected quality (EGESQ), expected compliance with demands (EANFO), expected shortcomings (EMANG) | Expectations (E)
total perceived quality (OGESQ), perceived compliance with needs (OANFO), perceived shortcomings (OMANG) | Perceived Quality (Q)
value for price (VAPRI), price for value (PRIVA) | Value (P)
total satisfaction (CSTOT), fulfilled expectations (ERWAR), comparison with ideal (IDEAL) | Customer Satisfaction (CS)
number of oral complaints (NOBES), number of written complaints (NOBRI) | Voice (V)
repurchase probability (WIEDE), tolerance against price change (PRVER) | Loyalty (L)
10
The Emotional Factor
  • Principal component analysis of satisfaction drivers:
  • staff (availability, politeness)
  • outlet (make-up, presentation of merchandise, cleanliness)
  • range (freshness and quality, richness)
  • price-value ratio (value for price, price for value)
  • customer orientation (access to outlet, shopping hours, queuing time for checkout, paying modes, price information, sales, availability of sales)
  • identifies (Zuba, 1997):
  • staff, outlet, range → Emotional Factor
  • price-value ratio → Value
  • customer orientation → Cognitive Factor

11
Structural Equation Models
  • Combine three concepts:
  • Latent variables
  • Pearson (1904), psychometrics
  • Factor analysis model
  • Path analysis
  • Wright (1934), biometrics
  • Technique to analyze systems of relations
  • Simultaneous regression models
  • Econometrics

12
Customer Satisfaction
  • is the result of the customer's comparison of
  • his/her expectations with
  • his/her experiences
  • has consequences for
  • loyalty
  • future profits of the supplier

13
Expectation vs. Experience
  • Expectation reflects
  • customer's needs
  • offer on the market
  • image of the supplier
  • etc.
  • Experiences include
  • perceived performance/quality
  • subjective assessment
  • etc.

14
CS-Model Path Diagram
[Path diagram linking Expectations, Perceived Quality, Customer Satisfaction, and Loyalty]
15
A General CS-Model
[Path diagram linking Expectations, Perceived Quality, Customer Satisfaction, Voice, Loyalty, and Profits]
16
CS-Model Structure
EX = expectation, PQ = perceived quality, CS = customer satisfaction, LY = loyalty

from \ to   EX   PQ   CS   LY
EX          –    X    X    0
PQ          0    –    X    0
CS          0    0    –    X
LY          0    0    0    –

Recursive structure: triangular form of the relations
17
CS-Model Equations
PQ = α1 + γ11 EX + ζ1
CS = α2 + β21 PQ + γ21 EX + ζ2
LY = α3 + β32 CS + ζ3
Simultaneous equations model in latent variables. Exogenous: EX; endogenous: PQ, CS, LY; error terms (noises): ζ1, ζ2, ζ3
18
Simple Linear Regression
  • Model: Y = α + γX + ζ
  • Observations: (xi, yi), i = 1,…,n
  • Fitted model: Ŷ = a + cX
  • OLS estimates a, c minimize the sum of squared residuals: c = sxy/sx², a = ȳ − c x̄
  • sxy: sample covariance of X and Y (see the numerical check below)
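A minimal numerical check of these formulas (a sketch in Python with NumPy; the data and the true coefficients 1.5 and 0.8 are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 1.5 + 0.8 * x + rng.normal(scale=0.5, size=100)   # true alpha = 1.5, gamma = 0.8

# OLS from sample moments: c = s_xy / s_x^2, a = mean(y) - c * mean(x)
s_xy = np.cov(x, y, ddof=1)[0, 1]
c = s_xy / np.var(x, ddof=1)
a = y.mean() - c * x.mean()

print(a, c)                    # close to 1.5 and 0.8
print(np.polyfit(x, y, 1))     # [slope, intercept] from NumPy, for comparison
```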

19
Criteria of Model Fit
  • R²: coefficient of determination
  • the squared correlation between Y and Ŷ: R² = r²(Y,Ŷ)
  • t-Test: test of H0: γ = 0 against H1: γ ≠ 0
  • t = c / s.e.(c)
  • s.e.(c): standard error of c
  • F-Test: test of H0: R² = 0 against H1: R² > 0
  • follows for large n the F-distribution with 1 and n−2 df
20
Multiple Linear Regression
  • Model: Y = α + X1γ1 + ... + Xkγk + ζ = α + x'γ + ζ
  • Observations: (xi1,…, xik, yi), i = 1,…,n
  • In matrix notation: y = α + Xγ + ζ
  • y, ζ: n-vectors; γ: k-vector; X: (n×k)-matrix
  • Fitted model: ŷ = a + Xc
  • OLS estimates a, c
  • R² = r²(y,ŷ)
  • F-Test
  • t-Test
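The same idea in matrix form, sketched with NumPy (data and coefficients again invented):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 3
X = rng.normal(size=(n, k))
gamma = np.array([0.5, -1.0, 2.0])
y = 1.0 + X @ gamma + rng.normal(scale=0.3, size=n)

# OLS on the design matrix [1, X]; coef = (a, c1, ..., ck)
D = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(D, y, rcond=None)

y_hat = D @ coef
R2 = np.corrcoef(y, y_hat)[0, 1] ** 2    # R^2 as squared correlation of y and y-hat
print(coef, R2)
```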

21
Simultaneous Equations Models
  • A 2-equations model:
  • PQ = α1 + γ11 EX + ζ1
  • CS = α2 + β21 PQ + γ21 EX + ζ2
  • In matrix notation: Y = BY + ΓX + ζ
  • with Y = (PQ, CS)', B = [0 0; β21 0], Γ = (γ11, γ21)'

β21, γ11, γ21: path coefficients
22
Simultaneous Equations Models
  • Model: Y = BY + ΓX + ζ
  • Y, ζ: m-vectors
  • B: (m×m)-matrix
  • Γ: (m×K)-matrix
  • X: K-vector
  • Problems:
  • Simultaneous equation bias: OLS estimates of the coefficients are not consistent
  • Identifiability: can the coefficients be consistently estimated?

Some assumptions on ζ: E(ζ) = 0, Cov(ζ) = Σ
Exogeneity: Cov(X, ζ) = 0
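A simulation sketch of the simultaneous-equation bias (Python/NumPy, invented coefficients): when the errors of the two equations are correlated, PQ is endogenous in the CS equation and OLS is inconsistent.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
# correlated equation errors: Cov(z1, z2) = 0.6
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=n)
EX = rng.normal(size=n)
PQ = 0.5 * EX + z[:, 0]               # PQ = g11*EX + z1
CS = 0.4 * PQ + 0.3 * EX + z[:, 1]    # CS = b21*PQ + g21*EX + z2

# OLS of CS on (PQ, EX): the PQ coefficient does not converge to the true 0.4
D = np.column_stack([PQ, EX])
b = np.linalg.lstsq(D, CS, rcond=None)[0]
print(b)   # first entry clearly above 0.4, because Cov(PQ, z2) = 0.6 > 0
```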
23
Path Analytic Model
PQ = γ11 EX + ζ1
CS = β21 PQ + γ21 EX + ζ2
[Path diagram: δ1 → EX with Var(δ1) = σ²EX; EX → PQ (γ11); EX → CS (γ21); PQ → CS (β21); errors ζ1, ζ2]
24
Path Analysis
  • Wright (1921, 1934)
  • A multivariate technique
  • Model Variables may be
  • structurally related
  • structurally unrelated, but correlated
  • Decomposition of covariances allows one to write covariances as functions of the structural parameters
  • Definition of direct and indirect effects

25
Example
σCS,EX = γ21 σ²EX + β21 σPQ,EX = γ21 σ²EX + γ11 β21 σ²EX
[Path diagram as on slide 23]
With standardized variable EX:
ρCS,EX = γ21 + γ11 β21
26
Direct and Indirect Effects
  • ρCS,EX = γ21 + γ11 β21
  • Direct effect: coefficient that links an independent with a dependent variable; e.g., γ21 is the direct effect of EX on CS
  • Indirect effect: effect of one variable on another via one or more intervening variable(s), e.g., γ11 β21
  • Total indirect effect: sum of all indirect effects between two variables
  • Total effect: sum of the direct and the total indirect effects between two variables
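For concreteness, a worked example with made-up coefficients (not estimates from any model in this deck):

```latex
\gamma_{21}=0.3,\quad \gamma_{11}=0.5,\quad \beta_{21}=0.4
\quad\Rightarrow\quad
\rho_{CS,EX}=\gamma_{21}+\gamma_{11}\beta_{21}=0.3+0.5\cdot 0.4=0.5
```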

27
Decomposition of Covariance σYX
σYX = Σ_I pYI σIX, summing over each variable I on a path from X to Y with a direct path into Y
pYI: path coefficient of variable I to Y
28
First Law of Path Analysis
  • Decomposition of the covariance σYX between Y and X
  • Assumptions
  • Exogenous (X) and endogenous variables (Y) have
    mean zero
  • Errors or noises (z)
  • have mean zero and equal variances across
    observations
  • are uncorrelated across observations
  • are uncorrelated with exogenous variables
  • are uncorrelated across equations

29
Identification
  • PQ = γ11 EX + ζ1, i.e., Y1 = γ11 X + ζ1
  • CS = β21 PQ + γ21 EX + ζ2, i.e., Y2 = β21 Y1 + γ21 X + ζ2
  • In matrix notation: Y = BY + ΓX + ζ
  • Number of parameters: p = 6
  • The model is identified if all parameters can be expressed as functions of the variances/covariances of the observed variables

30
Identification, cont'd
  • Y1 = γ11 X + ζ1
  • Y2 = β21 Y1 + γ21 X + ζ2
  • σ1X = γ11 σ²X
  • σ2X = β21 σ1X + γ21 σ²X
  • σ21 = β21 σ²1 + γ21 σ1X
  • σ²X = σ²δ
  • σ²1 = γ11 σ1X + σ²ζ1
  • σ²2 = β21 σ21 + γ21 σ2X + σ²ζ2

p = 6: the first 3 equations allow a unique solution for the path coefficients, the last three for the variances of δ and ζ
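A small symbolic check of this argument, sketched with Python/SymPy (the s-symbols stand for the observable moments above):

```python
import sympy as sp

g11, b21, g21 = sp.symbols('gamma_11 beta_21 gamma_21')
sX2, s1X, s2X, s11, s21 = sp.symbols('sX2 s1X s2X s11 s21')

# the first three covariance equations of the slide
eqs = [sp.Eq(s1X, g11 * sX2),
       sp.Eq(s2X, b21 * s1X + g21 * sX2),
       sp.Eq(s21, b21 * s11 + g21 * s1X)]

sol = sp.solve(eqs, [g11, b21, g21], dict=True)[0]
print(sol)   # each path coefficient as a function of observable (co)variances
```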
31
Condition for Identification
  • Just-identified: all parameters can be uniquely derived from functions of the variances/covariances
  • Over-identified: at least one parameter can be derived in more than one way from the variances/covariances
  • Under-identified: insufficient number of variances/covariances; at least one parameter cannot be determined
  • Necessary, but not sufficient condition for identification: the number of variances/covariances is at least as large as the number of parameters
  • A general and operational rule for checking identification has not been found

32
Latent variables and Indicators
  • Latent variables (LVs) or constructs or factors
    are unobservable, but
  • We might find indicators or manifest variables
    (MVs) for the LVs that can be used as measures of
    the latent variable
  • Indicators are imperfect measures of the latent
    variable

33
Indicators for Expectation
From the Swedish CSB Questionnaire, banks' private customers
[Measurement diagram: EX with reflective indicators E1, E2, E3 and measurement errors δ1, δ2, δ3]
E1, E2, E3: block of indicators (MVs) for Expectation
E1: When you became a customer of AB-Bank, you probably knew something about them. How would you grade your expectations on a scale of 1 (very low) to 10 (very high)?
E2: Now think about the different services they offer, such as bank loans, rates, … Rate your expectations on a scale of 1 to 10.
E3: Finally, rate your overall expectations on a scale of 1 to 10.
34
Notation
X1 = λ1ξ + δ1, X2 = λ2ξ + δ2, X3 = λ3ξ + δ3
[Measurement diagram: ξ → X1, X2, X3 with loadings λ1, λ2, λ3 and errors δ1, δ2, δ3; reflective indicators]
  • ξ: latent variable, factor
  • Xi: indicators, manifest variables
  • λi: factor loadings
  • δi: measurement errors, noise

Some properties: the LV has unit variance; noise δi has mean zero and variance σi² and is uncorrelated with the other noises
35
Notation
X1 = λ1ξ + δ1, X2 = λ2ξ + δ2, X3 = λ3ξ + δ3, or X = Λξ + δ
[Measurement diagram as on slide 34]
In matrix notation with vectors X, Λ, and δ; e.g., X = (X1, X2, X3)'
  • ξ: latent variable, factor
  • Xi: indicators, manifest variables
  • λi: factor loadings
  • δi: measurement error, noise
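A quick simulation of this measurement model (Python/NumPy; the loadings are invented). With standardized indicators and a unit-variance LV, corr(Xi, ξ) recovers λi:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
xi = rng.normal(size=n)                                  # latent variable, unit variance
lam = np.array([0.9, 0.7, 0.5])                          # factor loadings
delta = rng.normal(size=(n, 3)) * np.sqrt(1 - lam**2)    # errors so that Var(X_i) = 1
X = xi[:, None] * lam + delta                            # X_i = lambda_i * xi + delta_i

print([round(np.corrcoef(X[:, i], xi)[0, 1], 3) for i in range(3)])  # ~ lam
```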

36
CS-Model Path Diagram
[Full path diagram: indicators E1–E3 (errors δ1–δ3) measure EX; Q1–Q3 (errors ε1–ε3) measure PQ; C1–C3 (errors ε4–ε6) measure CS; inner paths EX → PQ (γ11), EX → CS (γ21), PQ → CS (β21); inner errors ζ1, ζ2]
37
SEM-Model Path Diagram
[Full path diagram: indicators X1–X3 (errors δ1–δ3) measure ξ; Y1–Y3 (errors ε1–ε3) measure η1; Y4–Y6 (errors ε4–ε6) measure η2; inner paths ξ → η1 (γ11), ξ → η2 (γ21), η1 → η2 (β21); inner errors ζ1, ζ2]
  • η = Bη + Γξ + ζ
  • X = Λx ξ + δ, Y = Λy η + ε
38
SEM-Model Notation
Inner relations, inner model:
  • η = Bη + Γξ + ζ

Outer relations, measurement model:
X = Λx ξ + δ, Y = Λy η + ε
X, δ: 3-component vectors; Y, ε: 6-component vectors
39
Statistical Assumptions
  • Error terms of the inner model (ζ):
  • have zero means
  • have constant variances across observations
  • are uncorrelated across observations
  • are uncorrelated with the exogenous variables
  • Error terms of the measurement models (δ, ε):
  • have zero means
  • have constant variances across observations
  • are uncorrelated across observations
  • are uncorrelated with the latent variables and with each other
  • Latent variables are standardized

40
Covariance Matrix of Manifest Variables
  • Unrestricted covariance matrix (order K = kx + ky): Σ = Var(X,Y)
  • Model-implied covariance matrix: Σ(θ)

41
Estimation of the Parameters
  • Covariance fitting methods
  • search for values of the parameters θ so that the model-implied covariance matrix fits the observed unrestricted covariance matrix of the MVs
  • LISREL (LInear Structural RELations): Jöreskog (1973), Keesling (1972), Wiley (1973)
  • Software: LISREL by Jöreskog & Sörbom
  • PLS techniques
  • partition of θ into estimable subsets of parameters
  • iterative optimizations provide successive approximations for LV scores and parameters
  • Wold (1973, 1980)

42
Discrepancy Function
  • The discrepancy or fitting function F(S, Σ) = F(S, Σ(θ)) is a measure of the distance between the model-implied covariance matrix Σ(θ) and the estimated unrestricted covariance matrix S
  • Properties of the discrepancy function:
  • F(S, Σ) ≥ 0
  • F(S, Σ) = 0 if S = Σ

43
Covariance Fitting (LISREL)
  • Estimates of the parameters are derived by minimizing F(S, Σ(θ)) over θ
  • Minimization of (K: number of indicators)
    F(S, Σ) = log|Σ| − log|S| + trace(SΣ⁻¹) − K
  • gives ML estimates if the manifest variables are independently, multivariate normally distributed
  • Iterative algorithm (Newton-Raphson type):
  • Identification!
  • Choice of starting values is crucial
  • Other choices of F result in estimation methods like OLS, GLS, and ADF (asymptotically distribution free)
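A sketch of the ML discrepancy function in Python/NumPy (the matrices are illustrative):

```python
import numpy as np

def f_ml(S, Sigma):
    """ML discrepancy: log|Sigma| - log|S| + trace(S Sigma^-1) - K."""
    K = S.shape[0]
    _, logdet_Sigma = np.linalg.slogdet(Sigma)
    _, logdet_S = np.linalg.slogdet(S)
    return logdet_Sigma - logdet_S + np.trace(S @ np.linalg.inv(Sigma)) - K

S = np.array([[1.0, 0.4],
              [0.4, 1.0]])
print(f_ml(S, S))            # 0.0: perfect fit
print(f_ml(S, np.eye(2)))    # > 0: misfit of an independence "model"
```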

44
PLS Techniques
  • Estimates factor scores for the latent variables
  • Estimates structural parameters (path coefficients, loading coefficients), based on the estimated factor scores, using the principle of least squares
  • Maximizes the predictive accuracy
  • Predictor specification, viz. that E(η|ξ) equals the systematic part of the model, implies E(ζ|ξ) = 0: the error term has (conditional) mean zero
  • No distributional assumptions beyond those on 1st and 2nd order moments

45
The PLS-Algorithm
  • Step 1: Estimation of factor scores
  • Outer approximation
  • Calculation of inner weights
  • Inner approximation
  • Calculation of outer weights
  • Step 2: Estimation of path and loading coefficients by minimizing Var(ζ) and Var(δ)
  • Step 3: Estimation of location parameters (intercepts)
  • B0 from η = B0 + Bη + Γξ + ζ
  • Λ0 from X = Λ0 + Λx ξ + δ

46
Estimation of Factor Scores
  • Factor ηi: realizations Yin, n = 1,…,N
  • Yin(o): outer approximation of Yin
  • Yin(i): inner approximation of Yin
  • Indicator Yij: observations yijn, j = 1,…,Ji, n = 1,…,N
  • 1. Outer approximation: Yin(o) = Σj wij yijn, s.t. Var(Yi(o)) = 1
  • 2. Inner weights: vih = sign(rih) if ηi and ηh are adjacent, otherwise vih = 0; rih = corr(ηi, ηh) (centroid weighting)
  • 3. Inner approximation: Yin(i) = Σh vih Yhn(o), s.t. Var(Yi(i)) = 1
  • 4. Outer weights: wij = corr(Yij, Yi(i))
  • Start: choose arbitrary values for the wij
  • Repeat 1. through 4. until the outer weights converge (a runnable sketch follows below)
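A compact, runnable sketch of steps 1.–4. (Python/NumPy, Mode A outer estimation, centroid inner weighting; the toy data and all numbers are invented, and sign indeterminacy of the scores is ignored):

```python
import numpy as np

def pls_scores(blocks, adjacency, max_iter=200, tol=1e-8):
    """Basic PLS algorithm: standardized LV scores from blocks of
    standardized indicators (Mode A, centroid inner weighting)."""
    m = len(blocks)
    w = [np.ones(b.shape[1]) for b in blocks]          # arbitrary start weights

    def std(y):
        return (y - y.mean()) / y.std()

    for _ in range(max_iter):
        # 1. outer approximation
        Y_out = np.column_stack([std(b @ wi) for b, wi in zip(blocks, w)])
        # 2. inner weights: sign of score correlations, adjacent LVs only
        V = np.sign(np.corrcoef(Y_out, rowvar=False)) * adjacency
        # 3. inner approximation
        Y_in = np.column_stack([std(Y_out @ V[i]) for i in range(m)])
        # 4. outer weights (Mode A): corr(indicator, inner approximation)
        w_new = [np.array([np.corrcoef(b[:, j], Y_in[:, i])[0, 1]
                           for j in range(b.shape[1])])
                 for i, b in enumerate(blocks)]
        if max(np.max(np.abs(a - b)) for a, b in zip(w_new, w)) < tol:
            break
        w = w_new
    return Y_out, w

# Toy data loosely shaped like the EX/PQ/CS example of the next slide
rng = np.random.default_rng(4)
N = 500
EX = rng.normal(size=N)
PQ = 0.5 * EX + rng.normal(scale=0.8, size=N)
CS = 0.4 * PQ + 0.3 * EX + rng.normal(scale=0.8, size=N)

def block(lv):   # three reflective indicators per latent variable
    B = np.column_stack([l * lv + rng.normal(scale=0.6, size=N) for l in (0.9, 0.8, 0.7)])
    return (B - B.mean(0)) / B.std(0)

blocks = [block(EX), block(PQ), block(CS)]
adj = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]])   # EX-PQ, EX-CS, PQ-CS adjacent
scores, weights = pls_scores(blocks, adj)
print(weights)
```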

47
Example
[Path diagram as on slide 36, with the path coefficients γ11, γ21, β21 to be estimated]
48
Example, cont'd
  • Starting values wEX,1,…,wEX,3, wPQ,1,…,wPQ,3, wCS,1,…,wCS,3
  • Outer approximation:
  • EXn(o) = Σj wEX,j Ejn; similarly PQn(o), CSn(o)
  • standardized
  • Inner approximation:
  • EXn(i) = PQn(o) + CSn(o)
  • PQn(i) = EXn(o) + CSn(o)
  • CSn(i) = EXn(o) + PQn(o)
  • standardized
  • Outer weights:
  • wEX,j = corr(Ej, EX(i)), j = 1,…,3; similarly wPQ,j, wCS,j

49
Choice of Inner Weights
  • Centroid weighting scheme: Yin(i) = Σh vih Yhn(o)
  • vih = sign(rih) if ηi and ηh are adjacent, vih = 0 otherwise
  • with rih = corr(ηi, ηh); these weights result if the vih are restricted to +1 or -1 and Var(Yi(i)) is maximized
  • Weighting schemes:

Scheme       ηh predecessor   ηh successor
centroid     sign(rih)        sign(rih)
factor, PC   rih              rih
path         bih              rih

bih: coefficient in the regression of ηi on ηh
50
Measurement Model Examples
  • Latent variables from the Swedish CSB Model
  • Expectation:
  • E1: new customer feelings
  • E2: special products/services expectations
  • E3: overall expectation
  • Perceived Quality:
  • Q1: range of products/services
  • Q2: quality of service
  • Q3: clarity of information on products/services
  • Q4: opening hours and appearance of location
  • Q5: etc.

51
Measurement Models
  • Reflective model: each indicator reflects the latent variable (example 1)
  • Yij = λij ηi + εij
  • Yij is called a reflective or effect indicator (of ηi)
  • Formative model (example 2):
  • ηi = πy' Yi + δi
  • πy is a vector of ki weights; the Yij are called formative or cause indicators
  • Hybrid or MIMIC model (for multiple indicators and multiple causes)
  • The choice between formative and reflective depends on the substantive theory
  • Formative models are often used for exogenous variables, reflective and MIMIC models for endogenous variables

52
Estimation of Outer Weights
  • Mode A estimation of Yi(o): reflective measurement model
  • weight wij is the coefficient from the simple regression of Yi(i) on Yij: wij = corr(Yij, Yi(i))
  • Mode B estimation of Yi(o): formative measurement model
  • weights wij are the coefficients of the Yij in the multiple regression of Yi(i) on Yij, j = 1,…,Ji
  • multicollinearity?!
  • MIMIC model
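A side-by-side sketch of the two modes (Python/NumPy; the inner proxy and the indicators are simulated for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
N = 1_000
Y_in = rng.normal(size=N)        # inner approximation of the LV (standardized)
Yij = np.column_stack([0.8 * Y_in + rng.normal(scale=0.6, size=N) for _ in range(3)])
Yij = (Yij - Yij.mean(0)) / Yij.std(0)

# Mode A: one simple regression per indicator -> correlations
w_A = np.array([np.corrcoef(Yij[:, j], Y_in)[0, 1] for j in range(3)])

# Mode B: one multiple regression of the proxy on all indicators jointly
w_B = np.linalg.lstsq(Yij, Y_in, rcond=None)[0]

print(w_A)   # all three weights of similar size
print(w_B)   # smaller, shared out across the correlated indicators
```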

53
Properties of Estimators
  • A general proof of convergence of the PLS algorithm does not exist; practitioners experience no problems
  • Factor scores are inconsistent, but consistent at large: consistency is achieved with increasing sample size and block size
  • Loading coefficients are inconsistent and seem to be overestimated
  • Path coefficients are inconsistent and seem to be underestimated

54
ACSI Model Results
[Path diagram: Expectations, Perceived Quality, Value, Customer Satisfaction, Voice, Loyalty; each path labeled with the EQS estimate followed by the PLS estimate: -0.38/-0.29, 0.78/0.47, 0.90/0.73, 0.17/(0.06), 0.95/0.53, 0.57/0.35, (-0.15)/0.12, 0.40/0.35, -0.24/(0.06)]
55
Evaluation of SEM-Models
  • Depends on the estimation method:
  • Covariance-fitting methods: distributional assumptions, optimal parameter estimates, factor indeterminacy
  • PLS path modeling: non-parametric, optimal prediction accuracy, LV scores
  • Step 1: Inspection of estimation results (R², parameter estimates, standard errors, LV scores, residuals, etc.)
  • Step 2: Assessment of fit
  • Covariance-fitting methods: global measures
  • PLS path modeling: partial fitting measures

56
Inspection of Results
  • Covariance-fitting methods: global optimization
  • Model parameters and their standard errors: do they confirm the theory?
  • Correlation residuals sij − σij(θ)
  • Graphical methods
  • PLS techniques: iterative optimization of the outer models and the inner model
  • Model parameters
  • Resampling procedures like blindfolding or jackknifing give standard errors of the model parameters
  • LV scores
  • Graphical methods

57
Fit Indices
  • Covariance-fitting methods: covariance fit measures such as
  • Chi-square statistics
  • Goodness of Fit Index (GFI), AGFI
  • Normed Fit Index (NFI), NNFI, CFI
  • etc.
  • Basis is the discrepancy function
  • PLS path modeling: prediction-based measures
  • Communality
  • Redundancy
  • Stone-Geisser's Q²

58
Chi-square Statistic
  • Test of H0: Σ = Σ(θ) against a non-specified alternative
  • Test statistic: X² = (N−1) F(S, Σ(θ̂))
  • If the model is just identified (c = p): X² = 0; c = K(K+1)/2, p = number of parameters in θ
  • Under the usual regularity conditions (normal distribution, ML estimation), X² is asymptotically χ²(c−p)-distributed
  • A non-significant X² indicates that the over-identified model does not differ from a just-identified version
  • Problem: X² increases with increasing N
  • Some prefer X²/(c−p) to X² (it has reduced sensitivity to sample size); rule of thumb: X²/(c−p) < 3 is acceptable
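A sketch of this computation (Python/SciPy; N = 327 is the sample size of the food-retail study, all remaining numbers are invented):

```python
from scipy.stats import chi2

N, K, p = 327, 15, 34          # sample size, indicators, free parameters (illustrative)
F_min = 0.76                   # minimized discrepancy (illustrative)

c = K * (K + 1) // 2           # distinct (co)variances: 120
X2 = (N - 1) * F_min           # chi-square statistic
df = c - p

print(X2, df, chi2.sf(X2, df))  # statistic, degrees of freedom, p-value
print(X2 / df)                  # rule of thumb: < 3 is acceptable
```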

59
Goodness of Fit Indices
  • Goodness of Fit Index (Jöreskog & Sörbom)
  • Portion of the observed covariances explained by the model-implied covariances
  • How much better the model fits compared to no model at all
  • Ranges from 0 (poor fit) to 1 (perfect fit)
  • Rule of thumb: GFI > 0.9
  • AGFI penalizes model complexity

60
Other Fit Indices
  • Normed Fit Index, NFI (Bentler & Bonett)
  • Similar to GFI, but compares with a baseline model, typically the independence model (indicators are uncorrelated)
  • Ranges from 0 (poor fit) to 1 (perfect fit)
  • Rule of thumb: NFI > 0.9
  • Comparative Fit Index, CFI (Bentler)
  • Less dependent on sample size than NFI
  • Non-Normed Fit Index, NNFI (Bentler & Bonett)
  • Also known as the Tucker-Lewis Index
  • Adjusted for model complexity
  • Root mean squared error of approximation, RMSEA (Steiger)

61
Assessment of PLS Results
  • Not a single optimization but many optimization steps; hence not one global measure but many measures of various aspects of the results
  • Indices for assessing the predictive relevance:
  • Portions of explained variance (R²)
  • Communality, redundancy, etc.
  • Stone-Geisser's Q²
  • Reliability indices
  • NFI, assuming normality of the indicators
  • Allows comparisons with covariance-fitting results

62
Some Indices
  • Assessment of diagonal fit (proportion of explained variances):
  • SMC (squared multiple correlation coefficient) R²: (average) proportion of the variance of the LVs that is explained by other LVs; concerns the inner model
  • Communality H²: (average) proportion of the variance of the indicators that is explained by the LV directly connected to them; concerns the outer model
  • Redundancy F²: (average) proportion of the variance of the indicators that is explained by the predictor LVs of their own LV
  • r²: proportion of explained variance of the indicators

63
Some Indices, cont'd
  • Assessment of non-diagonal fit:
  • Explained indicator covariances: rs = 1 − c/s with c = rms(C), s = rms(S); C: estimate of Cov(ε)
  • Explained latent variable correlation: rr = 1 − q/r with q = rms(Q), r = rms(Cov(Y)); Q: estimate of Cov(ζ)
  • reY = rms(Cov(e,Y)), e: outer residuals
  • reu = rms(Cov(e,u)), u: inner residuals
  • rms(A) = (Σi Σj aij²)^1/2: root mean squared covariances (diagonal elements of a symmetric A excluded from the summation)
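These indices are easy to compute once the covariance matrices are available; a sketch (Python/NumPy, with purely illustrative matrices):

```python
import numpy as np

def rms_offdiag(A):
    """Root of the summed squared off-diagonal elements, as in rms(A) above."""
    off = A - np.diag(np.diag(A))
    return np.sqrt(np.sum(off ** 2))

S = np.array([[1.0, 0.3, 0.2],
              [0.3, 1.0, 0.4],
              [0.2, 0.4, 1.0]])
C = 0.2 * S                     # illustrative estimate of Cov(eps)
print(1 - rms_offdiag(C) / rms_offdiag(S))   # rs = 1 - c/s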

64
Stone-Geisser's Q²
  • Similar to R²: Q² = 1 − E/O
  • E: sum of squared prediction errors; O: sum of squared deviations from the mean
  • Prediction errors from resampling (blindfolding, jackknifing)
  • E.g., communality of Yij, an indicator of ηi
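A sketch of the computation (Python/NumPy); in practice E would come from blindfolding or jackknifing, here the predictions are simply simulated:

```python
import numpy as np

def q_squared(y, y_pred):
    """Stone-Geisser Q^2 = 1 - E/O."""
    E = np.sum((y - y_pred) ** 2)          # squared prediction errors
    O = np.sum((y - y.mean()) ** 2)        # squared deviations from the mean
    return 1.0 - E / O

rng = np.random.default_rng(6)
y = rng.normal(size=200)
y_pred = 0.7 * y + rng.normal(scale=0.5, size=200)   # stand-in predictions
print(q_squared(y, y_pred))
```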

65
Lohmöller's Advice
  • Check the fit of the outer model:
  • Low unexplained portion of indicator variances and covariances
  • High communalities in reflective blocks, low residual covariances
  • Residual covariances between blocks close to zero
  • Covariances between outer residuals and latent variables close to zero
  • Check the fit of the inner model:
  • Low unexplained portion of latent variable variances and covariances
  • Check the fit of the total model:
  • High redundancy coefficient
  • Low covariances of inner and outer residuals

66
ACSI Model Results
[Path diagram repeated from slide 54: EQS estimate followed by PLS estimate on each path]
67
Diagnostics EQS
       ACSI    ACSIe
χ²     247.5   378.7
df     81      173
NNFI   0.898   0.930
RMSEA  0.079   0.060
68
Diagnostics PLS (centroid weighting)
       ACSI   ACSIe   Hui    Schenk
R²     0.29   0.35    0.43   0.40
Q²     0.36   0.41    0.58   0.49
rr     0.47   0.55    0.58   0.59
H²     0.71   0.64    0.64   0.64
F²     0.22   0.24    0.30   0.26
r²     0.63   0.63    0.57   0.60
reY    0.26   0.24    0.19   0.09
reu    0.19   0.17    0.16   0.08
69
Practice of SEM Analysis
  • Theoretical basis
  • Data:
  • Scaling: metric or nominal (nominal scales are not standard in LISREL)
  • Sample size: a good choice is 10p (p: number of parameters); fewer than 5p cases might result in unstable estimates; a large number of cases will result in large values of X²
  • Reflective indicators are assumed to be uni-dimensional; it is recommended to use principal axis extraction, Cronbach's alpha, and similar tools to confirm the suitability of the data
  • Model:
  • Identification must be checked for covariance-fitting methods
  • Indicators for an LV can be formative or reflective; formative indicators are not supported in LISREL

70
Practice of SEM Analysis, cont'd
  • Model:
  • LISREL allows for more general covariance structures, e.g., correlation of measurement errors
  • Estimation:
  • Repeat the estimation with varying starting values
  • Diagnostic checks:
  • Use graphical tools like plots of residuals, etc.
  • Check each measurement model
  • Check each structural equation
  • Lohmöller's advice
  • Model trimming
  • Stepwise model building (Hui, 1982; Schenk, 2001)

71
LISREL vs PLS
  • Models:
  • PLS assumes a recursive inner structure
  • PLS allows for higher complexity w.r.t. B, Γ, and Λ; LISREL w.r.t. Ψ and Θ
  • Estimation method:
  • Distributional assumptions are not needed in PLS
  • Formative measurement model in PLS
  • Factor scores in PLS
  • PLS: biased estimates, consistency at large
  • LISREL: ML theory
  • In PLS the diagnostics are much richer
  • Empirical facts:
  • LISREL in general needs larger samples
  • LISREL needs more computation

72
The Extended Model
[Path diagram: Emotional Factor, Perceived Quality, Expectations, Value, Customer Satisfaction, Loyalty; each path labeled with the EQS estimate followed by the PLS estimate: (0.20)/0.33, 0.31/0.35, 0.58/0.37, 0.55/0.36, 0.85/0.53, 0.87/0.73, (-0.14)/(-0.01), 0.48/0.34, 0.41/0.34, (-0.14)/(0.06)]
73
Diagnostics EQS
       ACSI    ACSIe
χ²     247.5   378.7
df     81      173
NNFI   0.898   0.930
RMSEA  0.079   0.060
74
Diagnostics PLS (centroid weighting)
       ACSI   ACSIe   Hui    Schenk
R²     0.29   0.35    0.43   0.40
Q²     0.36   0.41    0.58   0.49
rr     0.47   0.55    0.58   0.59
H²     0.71   0.64    0.64   0.64
F²     0.22   0.24    0.30   0.26
r²     0.63   0.63    0.57   0.60
reY    0.26   0.24    0.19   0.09
reu    0.19   0.17    0.16   0.08
75
Model Building: Hui's Approach
[Path diagram with PLS path coefficients: Emotional Factor, Perceived Quality, Expectations, Value, Customer Satisfaction, Loyalty; coefficients 0.61, 0.43, 0.31, -0.18, 0.10, 0.35, 0.36, 0.42, 0.33, -0.18, 0.17, 0.63, 0.21, 0.23, 0.12]
76
Model Building: Schenk's Approach
[Path diagram with PLS path coefficients: Emotional Factor, Perceived Quality, Expectations, Value, Customer Satisfaction, Loyalty; coefficients 0.32, 0.35, 0.31, 0.32, 0.73, 0.34, 0.60]
77
The end
78
Data-driven Specification
  • No solid a priori knowledge about the relations among the variables
  • Stepwise regression:
  • Search for the best model
  • Forward selection
  • Backward elimination
  • Problem: omitted variable bias
  • General-to-specific modeling

79
Stepwise SE Model Building
  • Hui (1982): models with interdependent inner relations
  • Schenk (2001): guarantees a causal structure, i.e., a triangular matrix B of path coefficients in the inner model
  • [Illustration of the triangular matrix B]

80
Stepwise SE Model Building
  • Hui's algorithm
  • Stage 1:
  • Calculate case values Yij for the LVs ηi as principal components of the corresponding blocks; calculate R = Corr(Y)
  • Choose for each endogenous LV the one with the highest correlation to form a simple regression
  • Repeat until a stable model is reached:
  • PLS-estimate the model, calculate case values, and recalculate R
  • Drop from each equation the LVs with t-value t < 1.65
  • Add in each equation the LV with the highest partial correlation with the dependent LV

81
Stepwise SE Model Building
  • Hui's algorithm, cont'd
  • Stage 2:
  • Use the rank condition for checking the identifiability of each equation
  • Use 2SLS for estimating the path coefficients in each equation

82
Hui's vs. Schenk's Algorithm
  • Hui's algorithm is not restricted to a causal structure: it allows cycles and an arbitrary structure of the matrix B
  • Schenk's algorithm:
  • uses an iterative procedure similar to that used by Hui
  • makes use of a priori information about the structure of the causal chain connecting the latent variables
  • the latent variables are to be sorted

83
Stepwise SE Model Building
  • Schenk's algorithm:
  • Calculate case values Yij for the LVs ηi as principal components of the corresponding blocks; calculate R = Corr(Y)
  • Choose the pair of LVs with the highest correlation
  • Repeat until a stable model is reached:
  • PLS-estimate the model, calculate case values, and recalculate R
  • Drop LVs with a non-significant t-value
  • Add the LV with the highest correlation with the already included LVs

84
Data, special CS dimensions
Staff (2): availability¹ (PERS), politeness¹ (FREU)
Outlet (3): make-up¹ (GEST), presentation of merchandise¹ (PRAE), cleanliness¹ (SAUB)
Range (2): freshness and quality (QUAL), richness (VIEL)
Customer orientation (7): access to outlet (ERRE), shopping hours (OEFF), queuing time for checkout¹ (WART), paying modes¹ (ZAHL), price information¹ (PRAU), sales (SOND), availability of sales (VERF)
¹ Dimension of the Emotional Factor
85
References
  • C. Fornell (1992), A National Customer Satisfaction Barometer: The Swedish Experience. Journal of Marketing, 56, 6-21.
  • C. Fornell and Jaesung Cha (1994), Partial Least Squares, pp. 52-78 in R.P. Bagozzi (ed.), Advanced Methods of Marketing Research. Blackwell.
  • J.B. Lohmöller (1989), Latent Variable Path Modeling with Partial Least Squares. Physica-Verlag.
  • H. Wold (1982), Soft Modeling: The Basic Design and Some Extensions, in Vol. 2 of Jöreskog and Wold (eds.), Systems under Indirect Observation. North-Holland.
  • H. Wold (1985), Partial Least Squares, pp. 581-591 in S. Kotz and N.L. Johnson (eds.), Encyclopedia of Statistical Sciences, Vol. 6. Wiley.