Multivariate Analysis of Variance (MANOVA)
1
Chapter 14
Multivariate Analysis of Variance
(MANOVA) (Field, 2005)
2
What can you do with a MANOVA?
  • Until now we had only measured a single dependent
    variable.
  • Therefore, ANOVA is called 'univariate'
  • ANOVA can have one or more independent
    variable(s)
  • A MANOVA is an ANOVA for several dependent
    variables.
  • Therefore, MANOVA is called 'multivariate'
  • Like ANOVA, MANOVA can have one or more
    independent variables

3
Why MANOVA?
  • If you conducted separate ANOVAs, any
    relationship between the dependent variables is
    ignored. However, there might be correlations
    between them.
  • → MANOVA can tell us whether groups differ along
    a combination of dimensions

4
Advantage of having multiple Dependent Variables
Example: Distinguishing the three groups of
'drivers', 'drunk drivers', and 'no drivers' by
...
  • ... multiple dependent variables as in MANOVA
  • Number of pedestrians killed
  • Number of lampposts hit
  • Number of cars they crash in
  • ...a single dependent variable as in ANOVA
  • Number of pedestrians killed

→ MANOVA is more powerful in distinguishing these
groups since it has more information on a variety
of dependent variables
5
How many dependent variables?
  • Do not add any number of dependent variables you
    can think of but only reasonable ones which are
    theoretically and empirically motivated
  • If you want to explore some novel dependent
    variables you might run separate analyses for the
    theoretically motivated ones and for the
    explorative ones.

6
Controversies: MANOVA is a 2-stage test
  • 1. Overall test
  • There are 4 possible ways for assessing the
    overall effect of MANOVA
  • - Pillai-Bartlett trace (V)
  • - Hotelling's T²
  • - Wilks's lambda (Λ)
  • - Roy's largest root
  • 2. Separate tests for the various group
    differences
  • There are two main ways of following up on the
    group differences
  • Univariate ANOVAs
  • Discriminant analysis

7
The power of MANOVA
  • MANOVA has greater power than ANOVA in detecting
    differences between groups.
  • However, there is a complex relationship in that
    the power of MANOVA depends on a combination of
    the correlation between Dep Var's and the effect
    size.
  • For large effects, MANOVA has greater power if
    the variables are different (even negatively
    correlated) and if the group differences are in
    the same directions for those variables.
  • For 2 variables, one of which has a large and one
    of which has a small effect, power will increase
    if the variables are highly correlated.

8
The example throughout the chapter
  • We want to assess the effects of cognitive
    behaviour therapy (CBT) on obsessive compulsive
    disorder (OCD).
  • CBT will be compared with Behavior Therapy (BT)
    and with no-treatment (NT) as a control
    condition.
  • Since OCD manifests itself both behaviorally
    (obsessive actions) as well as cognitively
    (obsessive thoughts), both will be measured.
  • Note that the two dependent variables are
    theoretically motivated!

9
The data from OCD.sav
10
The theory of MANOVA
  • For understanding what is going on in a MANOVA we
    have to understand (a little bit of ) Matrices
  • A Matrix is a collection of numbers arranged in
    columns and rows.
  • Examples:
  • 2x3 matrix:          5x4 matrix:
  •   1 2 3                1 2 3 4
  •   4 5 6                5 6 7 8
  •                        9 1 3 5
  •                        6 7 2 8
  •                        0 5 2 8

The values within a matrix are called 'components'
or 'elements'
Each row = data from 1 subject
Each column = data from 1 variable
11
More on Matrices
  • A square matrix is one with an equal number of
    rows and columns, e.g.
  • 5 3 5 7 8
  • 4 2 1 0 5
  • 1 3 9 7 4
  • 1 3 5 8 0
  • 9 6 3 7 2
  • The red numbers are 'diagonal components', the
    black ones 'off-diagonal components'
  • An identity matrix is a square matrix in which
    the diagonal components are 1 and the
    off-diagonal components are 0
  • 1 0 0 0
  • 0 1 0 0
  • 0 0 1 0
  • 0 0 0 1

12
More on matrices
  • A matrix with data from only 1 person is called
    'row vector'. It can be thought of as a single
    person's score on five different variables
  • (5 3 5 7 8)
  • A matrix with only one column is called 'column
    vector'. It can be thought of as five
    participants' score on a single variable
  • 8
  • 5
  • 7
  • 8
  • 2
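A minimal Python/numpy sketch of these matrix and vector objects (numpy is an assumption here; any matrix library would do):

import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])                      # a 2x3 matrix: 2 rows (subjects), 3 columns (variables)
I = np.eye(4)                                  # a 4x4 identity matrix: 1s on the diagonal, 0s elsewhere
row_vec = np.array([[5, 3, 5, 7, 8]])          # one person's scores on five variables
col_vec = np.array([[8], [5], [7], [8], [2]])  # five participants' scores on one variable
print(A.shape, I.shape, row_vec.shape, col_vec.shape)   # (2, 3) (4, 4) (1, 5) (5, 1)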

13
Important matrices and their functions
  • MANOVA uses a matrix that contains information
    about the variance accounted for by the
    independent variables, for each dependent
    variable.
  • For each variance portion (hypothesis/model,
    error, and total variance) there is a sum of
    squares and cross-products matrix.

The matrices are used like the simple sum of
squares (SSM, SSR, SST) for deriving the test
statistics.
14
What are 'Cross products'?
  • In the 'sum of squares and cross-products
    matrix', what do the 'cross products' mean?
  • It is the value for the total combined error
    between two variables. Cross-products represent
    the total correlation between two variables in an
    unstandardized way.
  • It is these cross-products in terms of which
    MANOVA accounts for any correlation between
    dependent variables.

15
Calculating MANOVA by hand (using the OCD
data). First approach: two univariate ANOVAs
  • Univariate ANOVA for DV1 (actions)
  • For the first dependent variable 'number of
    compulsive actions' we have to determine 3
    variance portions
  • Total, Model, and Residual Sum of Squares.

SST(Action)
SSM(Action)
SSR(Action)
16
Calculating SST, SSM, and SSR for Action
Grand Variance 'Actions'
  • SST = s²grand × (N−1)
  •     = 2.1195 × (30−1)
  •     = 61.47

17
Calculating SST, SSM, and SSR for Action
  • SSM Summing up the differences between each
    group mean and the grand mean, squaring it and
    multiplying by the number of subjects in the
    group
  • SSM = 10(4.9−4.53)² + 10(3.7−4.53)² + 10(5−4.53)²
  •     = 10(0.37)² + 10(−0.83)² + 10(0.47)²
  •     = 1.37 + 6.89 + 2.21
  •     = 10.47

18
Calculating SST, SSM, and SSR for Action
  • SSR Taking the variance within each group,
    multiplying them with the n of scores -1 and
    adding them all up
  • SSR = s²CBT(nCBT − 1) + s²BT(nBT − 1) + s²NT(nNT − 1)
  •     = 1.433(10−1) + 3.122(10−1) + 1.111(10−1)
  •     = (1.433×9) + (3.122×9) + (1.111×9)
  •     = 12.9 + 28.1 + 10.0
  •     = 51.00

Check: SST = SSM + SSR, i.e., 61.47 − 10.47 = 51.00
19
Calculating Mean Sum of Squares MST, MSM, and
MSR for Action
  • We divide SSM and SSR by their df's and derive
    the Mean sum of squares.

df(SSM) = k − 1 = 3 − 1 = 2;  df(SSR) = k(n − 1) = 3(10 − 1) = 27
MSM = SSM/df(SSM) = 10.47/2 = 5.235;  MSR = SSR/df(SSR) = 51.00/27 = 1.889
Last, we divide MSM by MSR and derive the
F-ratio: F = MSM/MSR = 5.235/1.889 = 2.771
The F-value has to be compared against the
critical F-value Fcrit(2,27) = 3.355. Since the
F-value of our model is smaller than the critical
F-value, we cannot conclude that the 3 therapies
differ in their effect on compulsive actions (the
effect is not significant).
Fcrit(2,27) = 3.355
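The same arithmetic can be reproduced from the summary statistics quoted above; a short Python sketch (numpy/scipy assumed, group means and variances taken from these slides):

import numpy as np
from scipy import stats

n = 10                                            # subjects per group
means = np.array([4.9, 3.7, 5.0])                 # group means for 'actions' (CBT, BT, NT)
variances = np.array([1.433, 3.122, 1.111])       # group variances for 'actions'
grand_mean = 4.53

ss_m = np.sum(n * (means - grand_mean) ** 2)      # model sum of squares, ~10.47
ss_r = np.sum(variances * (n - 1))                # residual sum of squares, ~51.0
df_m, df_r = 3 - 1, 3 * (n - 1)                   # 2 and 27
F = (ss_m / df_m) / (ss_r / df_r)                 # ~2.77
F_crit = stats.f.ppf(0.95, df_m, df_r)            # ~3.35
print(round(F, 3), round(F_crit, 3), F > F_crit)  # the effect is not significant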
20
Univariate ANOVA for DV2 (thoughts). For the
second dependent variable 'number of compulsive
thoughts' we also have to determine the 3
variance portions: Total, Model, and Residual
Sum of Squares.
SST(Thought)
SSM(Thought)
SSR(Thought)
21
Calculating SST, SSM, and SSR for Thought
Grand Variance 'Thoughts'
SST = s²grand × (N−1) = 4.8780 × (30−1)
    = 141.47
22
Calculating SST, SSM, and SSR for Thought
  • SSM Summing up the differences between each
    group mean and the grand mean, squaring it and
    multiplying by the number of subjects in the
    group
  • SSM = 10(13.40−14.53)² + 10(15.2−14.53)² + 10(15−14.53)²
  •     = 10(−1.13)² + 10(0.67)² + 10(0.47)²
  •     = 12.77 + 4.49 + 2.21
  •     = 19.47

23
Calculating SST, SSM, and SSR for Thought
  • SSR Taking the variance within each group,
    multiplying them with the n of scores -1 and
    adding them all up
  • SSR = s²CBT(nCBT − 1) + s²BT(nBT − 1) + s²NT(nNT − 1)
  •     = 3.6(10−1) + 4.4(10−1) + 5.56(10−1)
  •     = (3.6×9) + (4.4×9) + (5.56×9)
  •     = 32.4 + 39.6 + 50.0
  •     = 122.00

Check: SST = SSM + SSR, i.e., 141.47 − 19.47 = 122.00
24
Calculating Mean Sum of Squares MST, MSM, and
MSR for Thought
  • We divide SSM and SSR by their df's and derive
    the Mean sum of squares.

MSM = SSM/df(SSM) = 19.47/2 = 9.735;  MSR = SSR/df(SSR) = 122.00/27 = 4.519
Last, we divide MSM by MSR and derive the
F-ratio: F = MSM/MSR = 9.735/4.519 = 2.154
The F-value has to be compared against the
critical F-value Fcrit(2,27) = 3.355. Since the
F-value of our model is smaller than the critical
F-value, we cannot conclude that the 3 therapies
differ in their effect on compulsive thoughts (the
effect is not significant).
Fcrit(2,27) = 3.355
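The critical value used on these slides can be checked with scipy; a brief sketch comparing both univariate F-values against it:

from scipy import stats

f_crit = stats.f.ppf(0.95, dfn=2, dfd=27)          # upper 5% point of F(2, 27), ~3.354
for name, f_value in [("actions", 2.771), ("thoughts", 2.154)]:
    p = stats.f.sf(f_value, 2, 27)                 # one-tailed p-value for the observed F
    print(name, round(p, 3), f_value > f_crit)     # neither F exceeds the critical value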
25
The relationships between the 2 DVs
  • MANOVA takes into consideration the relationship
    between the DVs by way of calculating their
    cross-products.
  • More specifically, there are 3 cross-products
    which relate to the 3 SSs that we have calculated
    within the Univariate ANOVAs
  • - Total cross-product CPT
  • - Model cross-product CPM
  • - Residual cross-product CPR

26
Calculating the total cross-product, CPT
  • The total cross-product between our 2 Dvs
    'action' and 'thought' is calculated by the
    following equation
  • CPT = Σ over all subjects of (xi(Actions) − grand
    mean(Actions)) × (xi(Thoughts) − grand mean(Thoughts))
  • For all subjects, we have to add up the product
    of the differences between the individual scores
    for action and thought minus the respective grand
    means.

27
Total Cross-Product CPT
The Total Cross-product tells us how the two
dependent Variables, DV1 and DV2, are related,
overall.
The Total Cross-Product CPT = 5.47
28
Calculating the Model cross-product, CPM
  • The model cross-product tells us how the
    relationship between our 2 DVs 'action' and
    'thought' is affected by our experimental
    manipulation
  • CPM = Σ over the groups of n × (group mean(Actions) − grand
    mean(Actions)) × (group mean(Thoughts) − grand mean(Thoughts))
  • For all 3 experimental groups, we have to add up
    the product of the differences between the group
    means for action and thought minus the respective
    grand means.

29
Model Cross-Product CPM
The Model Cross-Product CPM = −7.53
30
Calculating the Residual cross-product, CPR
  • The residual cross-product tells us how the
    relationship between our 2 DVs 'action' and
    'thought' is affected by individual differences
    (errors) in the model
  • CPR = Σ over all subjects of (xi(Actions) − group
    mean(Actions)) × (xi(Thoughts) − group mean(Thoughts))
  • For all subjects, we have to add up the product
    of the differences between the individual scores
    for action and thought minus the respective group
    means.
  • An easier way to calculate CPR is to subtract CPM
    from CPT
  • CPR = CPT − CPM = 5.47 − (−7.53) = 13

31
Residual Cross Product CPR
The CPR is similar to the CPT, except that the
group means rather than the grand means are
subtracted from the individual scores.
The Residual Cross-Product CPR = 13
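CPM (and, once CPT is known, CPR) can be reproduced from the group means alone; a minimal numpy sketch using the values quoted on the preceding slides:

import numpy as np

n = 10
actions_means  = np.array([4.9, 3.7, 5.0])         # group means (CBT, BT, NT)
thoughts_means = np.array([13.4, 15.2, 15.0])
grand_actions, grand_thoughts = 4.53, 14.53        # grand means

cp_m = np.sum(n * (actions_means - grand_actions) * (thoughts_means - grand_thoughts))
cp_t = 5.47                                        # CPT needs the raw scores; value taken from the slides
cp_r = cp_t - cp_m                                 # CPR = CPT - CPM
print(round(cp_m, 2), round(cp_r, 2))              # approx. -7.53 and 13.0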
32
The sum of squares and cross-products (SSCP) matrices
We shall now represent the total, residual, and
model sums of squares and their respective
cross-products in matrices. These combined
matrices are called sum of squares and
cross-products (SSCP) matrices. There are 3 of
them: T = total SSCP (SSCPT), E = error SSCP
(SSCPE), H = model SSCP (SSCPH) ('H' for
'Hypothesis'). Since we have 2 DVs, all the
SSCP matrices T, E, and H will be 2x2 matrices.
33
The total Sum of squares cross product (SSCP)
matrix (T)
  • The T matrix represents the Total Sum of squares
    of each DV (SST(Actions) and SST(Thoughts) ) as
    well as their cross-product (CPT), the total
    co-dependence between the 2 DVs.

T
Note: The values for CPT are the same for both
DVs
34
The residual (error) Sum of squares cross product
(SSCP) matrix (E)
  • The E matrix represents the Residual or Error Sum
    of squares of each DV (SSR(Actions) and
    SSR(Thoughts)) as well as their cross-product
    (CPR), the residual co-dependence between the 2
    DVs.

E
Note: The values for CPR are the same for both
DVs
35
The Model (Hypothesis) Sum of squares cross
product (SSCP) matrix (H)
  • The H matrix represents the Model or Hypothesis
    Sum of squares of each DV (SSM(Actions) and
    SSM(Thoughts) ) as well as their cross-product
    (CPM), the model co-dependence between the 2 DVs.

H
Note: The values for CPM are the same for both
DVs
36
Checking the matrices
H (Model) + E (Error) = T (Total)

  • You can calculate with matrices as you can with
    simple numbers. Thus, H + E = T.
  • This is a way of checking whether the numbers in
    the matrices are right. They are!
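The same check can be done with numpy in a couple of lines, using the sums of squares and cross-products derived above:

import numpy as np

H = np.array([[10.47, -7.53],                      # model SSCP: SSM(Actions), CPM
              [-7.53, 19.47]])                     #             CPM, SSM(Thoughts)
E = np.array([[51.00, 13.00],                      # error SSCP: SSR(Actions), CPR
              [13.00, 122.00]])                    #             CPR, SSR(Thoughts)
T = np.array([[61.47,  5.47],                      # total SSCP: SST(Actions), CPT
              [ 5.47, 141.47]])                    #             CPT, SST(Thoughts)

print(np.allclose(H + E, T))                       # True: H + E = T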

37
Principle of the MANOVA test statistic
  • In univariate ANOVA, we divide MSM/MSR in order
    to obtain the F-value.
  • In multivariate ANOVA, we would have to divide
    H/E then.
  • Problem: H and E are matrices, and matrices cannot
    be readily divided.
  • Solution: The equivalent of division for matrices
    is matrix inversion, hence H is multiplied by the
    inverse of E, called E⁻¹. The product is HE⁻¹.

38
Matrix inversion
  • Matrix inversion is too difficult to be dealt
    with here. We just have to take the inverted
    matrices for granted in this example:
  • E → E⁻¹
  • H → HE⁻¹
  • Thus, our test statistics will be based on HE⁻¹.
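For the curious, HE⁻¹ and its eigenvalues can be computed directly with numpy from the H and E matrices built above (a sketch; the eigenvalues come out close to 0.335 and 0.073, the values quoted from Field, 2005, on later slides):

import numpy as np

H = np.array([[10.47, -7.53], [-7.53, 19.47]])     # model SSCP
E = np.array([[51.00, 13.00], [13.00, 122.00]])    # error SSCP

HE_inv = H @ np.linalg.inv(E)                      # 'divide' H by E via the inverse of E
eigenvalues = np.linalg.eigvals(HE_inv)
print(np.round(eigenvalues, 3))                    # approx. [0.335 0.073]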

39
Test statistic
  • HE⁻¹ represents the ratio of systematic variance
    of the model to the unsystematic variance of the
    error. So it is conceptually the same as the
    F-ratio.
  • Problem: The F-ratio is a single value, whereas
    now we have several ones. We will always have as
    many as the square of the number of DVs, in our
    example 2² = 4.
  • Solution: the DVs are converted into underlying
    dimensions or factors, so-called 'eigenvalues'.
  • → If you want to know how HE⁻¹ is computed, look
    into the Appendix for Chapter 14 on Field's CD.

40
Discriminant function variates
  • Representing the DVs by underlying dimensions is
    like working back in a regression, namely to
    derive from a set of Dependent Variables the
    underlying Independent Variables. These linear
    combinations of the DVs are called variates,
    latent variables, or factors.
  • Knowing these linear variates, we can predict
    which group (here, therapy group) a person
    belongs to. Since the variates are used to
    discriminate groups of people, they are called
    discriminant function variates.

41
Discriminant function variates
  • How do we find the discriminant function
    variates?
  • By maximization which means that the first
    discriminant function (V1) is the linear
    combination of dependent variables that maximizes
    the differences between groups.
  • Hence the ratio of systematic to unsystematic
    variance (SSM/SSR) will be maximized for V1. For
    subsequent variates (V2, etc.), this ratio will
    be smaller.
  • Practically, we obtain the maximum possible value
    of the F-ratio when we look at V1.

42
Discriminant function variates
  • The variate V1 can be described as a linear
    regression equation, where the 2 DVs are the
    predicting values and V1 is the predicted value
  • Y = b0 + b1X1 + b2X2
  • V1 = b0 + b1DV1 + b2DV2
  • V1 = b0 + b1Actions + b2Thoughts
  • In linear regression, the b-values are the
    weights of the predictors. In discriminant
    function analysis they are obtained from
    eigenvectors of the HE⁻¹ matrix.
  • b0 can be ignored since we are only interested
    in the discrimination function and not in the
    constant.

Note that by looking for the underlying factors
of the Dep Var's, they become predictors (like
Indep Var's). This is because we want to find out
what dimension(s) underlie them.
43
Discriminant function variates
  • How many variates are there?
  • → The smaller of p and (k−1), where
    p is the number of DVs and k is the number of
    levels (groups) of the independent variable. In
    our case, both yield 2.
  • → We will find 2 variates.
  • The b-values of the 2 variates are derived from
    the eigenvectors of the matrix HE⁻¹. There will be
    2 such eigenvectors, each with 2 elements, one for
    each variate.

44
Discriminant function variates
  • An eigenvector is a vector of a matrix which is
    unchanged by transformations of that matrix to a
    diagonal matrix, i.e., one with only diagonal
    elements.
  • By changing HE-1 into a diagonal matrix we reduce
    the numbers of elements we have to consider for
    testing significance while preserving the ratio
    of systematic vs. unsystematic variance.
  • We won't calculate those eigenvectors ourselves
  • but just adopt them from the book
    (Field, 2005, p. 589):
  • eigenvector 1 (for variate 1) = (0.603, −0.335)
  • eigenvector 2 (for variate 2) = (0.425, 0.339)

These are the b's for variate 1 and 2 in the
regression equation
45
The regression equation for the 2 variates
  • V1 = b0 + b1Actions + b2Thoughts
  • (b0 can be omitted since it plays
  • no role in the discrimination function)
  • Variate 1: V1 = 0.603 × Actions − 0.335 × Thoughts
  • Variate 2: V2 = 0.425 × Actions + 0.339 × Thoughts

46
The discriminant function
  • The equation can be used to calculate a score for
    each subject on the variate.
  • Example: Subject 1 in the CBT group had 5 obsessive
    actions and 14 obsessive thoughts. His scores for
    variates 1 and 2 are
  • V1 = (0.603 × 5) − (0.335 × 14) = −1.675
  • V2 = (0.425 × 5) + (0.339 × 14) = 6.871
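The same scores fall out of a dot product of the subject's raw scores with each eigenvector; a tiny numpy sketch:

import numpy as np

b1 = np.array([0.603, -0.335])     # b-weights (eigenvector) for variate 1
b2 = np.array([0.425,  0.339])     # b-weights (eigenvector) for variate 2
subject = np.array([5, 14])        # subject 1: 5 obsessive actions, 14 obsessive thoughts

print(round(subject @ b1, 3), round(subject @ b2, 3))   # -1.675 and 6.871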

47
The virtue of the data reduction
  • Calculating such scores for each subject and
    then calculating the SSCP matrices (H, E, T, and
    HE⁻¹) from them leads to zero cross-products. This is
    because the variates extracted from the data are
    uncorrelated, so no cross-product can arise. This
    has the effect that all non-diagonal elements
    (the cross-products) vanish. Only the diagonal
    elements remain which represent the ratio of the
    systematic to the unsystematic variation. This is
    a welcome data reduction.
  • The HE⁻¹ matrix for our two variates is
  • HE⁻¹ (variates)

The eigenvalues of the original HE⁻¹ matrix are
0.335 and 0.073.
→ If you want to know how the eigenvalues (the
roots of the square matrix HE⁻¹) are computed, look
into the Appendix for Chapter 14 on Field's CD.
48
Eigenvalues as F-ratios
  • The eigenvalues we have just derived (0.335 and
    0.073) are the conceptual analog to the F-ratio
    in univariate ANOVA.
  • Once we have them, they have to be compared
    against the value that would result by chance
    alone.
  • There are 4 ways in which those chance values can
    be calculated:
  • - Pillai-Bartlett trace (V)
  • - Hotelling's T²
  • - Wilks's lambda (Λ)
  • - Roy's largest root

All those values are variations of a common
theme: the ratio of explained to unexplained
variance, and correspond more or less to the
F-ratio SSM/SSR in univariate ANOVA.
49
Pillai-(Bartlett) trace (V)
Action- and Thought- eigenvalues
  • V = Σ λi / (1 + λi), summing over i = 1 to s
  • V = 0.335/(1 + 0.335) + 0.073/(1 + 0.073) = 0.319
  • λ is the eigenvalue for each of the discriminant
    variates, s is the number of variates. Pillai's
    trace is thus the sum of the proportion of
    explained variance on the discriminant functions.
    It directly corresponds to SSM/SST.

50
Hotelling's T2
  • Hotelling's T² is simply the sum of the
    eigenvalues for each variate.
  • T = Σ λi = 0.335 + 0.073 = 0.408, summing over i = 1 to s
  • Here, we sum SSM/SSR for each of the variates. It
    compares directly to the F-ratio in ANOVA.

Action- and Thought- eigenvalues
51
Wilks' lambda (Λ)
  • Wilks' lambda (Λ) is the product of the
    unexplained variance on each of the variates. The
    symbol Π is similar to the summation symbol Σ,
    but it means 'multiply' rather than 'add'.
  • Λ = Π 1/(1 + λi), multiplying over i = 1 to s
  •   = 1/(1 + 0.335) × 1/(1 + 0.073) = 0.698
  • Wilks' lambda represents the ratio of error
    variance to total variance (SSR/SST) for each
    variate.

Action- and Thought- eigenvalues
Π = 'multiply', Σ = 'add'
52
Roy's largest root
Action- and Thought- eigenvalues
  • Roy's largest root is simply the largest
    eigenvalue, here 0.335, the eigenvalue of the
    first variate.
  • It is thus the ratio of the explained to
    unexplained variance of the first discrimination
    function.
  • Again, it is conceptually the same as SSM/SSR in
    univariate ANOVA.
  • Since the first variate maximally discriminates
    the between-group differences, taking it as the
    test statistic is often most powerful.
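All four statistics follow directly from the two eigenvalues; a minimal numpy sketch:

import numpy as np

eigenvalues = np.array([0.335, 0.073])                 # eigenvalues of HE⁻¹

pillai    = np.sum(eigenvalues / (1 + eigenvalues))    # ~0.319
hotelling = np.sum(eigenvalues)                        # ~0.408
wilks     = np.prod(1 / (1 + eigenvalues))             # ~0.698
roy       = np.max(eigenvalues)                        # 0.335 (largest root)
print(round(pillai, 3), round(hotelling, 3), round(wilks, 3), roy)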

53
Assumptions of MANOVA
  • MANOVA adds further assumptions to the familiar
    assumptions for ANOVA
  • Independence: Observations have to be
    statistically independent.
  • Random sampling: Data have to be sampled randomly
    from the population and measured at the interval
    level.
  • Multivariate normality: The dependent variables
    have to show multivariate normality within
    groups.
  • Homogeneity of covariance matrices: The variances
    within all groups on all DVs have to be the same.
    Furthermore, the correlation between any 2 DVs
    has to be the same in all groups. It has to be
    tested whether the population variance-covariance
    matrices of the different groups in the analysis
    are equal.

54
What is 'multivariate normality'?
In probability theory and statistics, a
multivariate normal distribution, also sometimes
called a multivariate Gaussian distribution, is a
specific probability distribution, which can be
thought of as a generalization to higher
dimensions of the one-dimensional normal
distribution (also called a Gaussian
distribution). It is also closely related to
matrix normal distribution.
http://en.wikipedia.org/wiki/Multivariate_normal_distribution
55
What does 'multivariate normality' look
like? (Excerpt from http://nitro.biosci.arizona.edu/zdownload/Volume2/Appendix02.pdf)
The eigenvalues yield the axes of symmetry, here
for n = 2 principal components
The resulting multivariate normality plot is not
a curve but a 'mountain' in a 3-dim space
56
Checking multivariate assumptions
  • The assumption of multivariate normality cannot
    be directly checked in SPSS. Alternatively, for
    each DV separately, univariate normality has to
    be checked. However, this is a necessary but not
    sufficient condition.
  • The assumption of equality of covariance matrices
    presupposes equality of variances between groups.
    This can be checked by Levene's test, which should
    be non-significant (n.s.) for each of the
    dependent variables. The variance-covariance
    matrices then have to be compared between groups
    using Box's test. (Since Box's test relies on
    multivariate normality, this assumption always
    has to be checked first.)
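Outside SPSS, the equality-of-variances part of this check can be sketched with scipy's Levene test (the scores below are illustrative placeholders chosen to match the group means and variances quoted earlier; substitute the real OCD.sav columns; Box's M test has no standard scipy function):

from scipy import stats

actions_cbt = [5, 5, 4, 4, 5, 3, 7, 6, 6, 4]       # placeholder per-group 'actions' scores
actions_bt  = [4, 4, 1, 1, 4, 6, 5, 5, 2, 5]
actions_nt  = [4, 5, 5, 4, 6, 4, 7, 4, 6, 5]

stat, p = stats.levene(actions_cbt, actions_bt, actions_nt)
print(round(stat, 3), round(p, 3))                 # p > .05 suggests equal variances are tenable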

57
Choosing a test statistic with respect to power
  • Which of the 4 test statistics shall we use?
  • For small and medium-sized groups all 4 have
    similar statistical power.
  • If groups differ most on the 1st variate, 'Roy's
    largest root' is the most powerful statistic. If
    they differ on more than 1 variate, Pillai's
    trace is better.
  • If you only have a medium-sized sample, you
    should not use too many DVs.
  • All four statistics are relatively robust in
    terms of violations of multivariate assumptions.

58
Follow-up analysis univariate ANOVA
  • Traditionally, MANOVA is always followed up by
    univariate ANOVAs for the single DVs. However, a
    Bonferroni correction should be applied to
    correct for the increased family-wise error.
  • Note that univariate ANOVAs are NOT in the spirit
    of MANOVAs since you can only find out about any
    single DV, not about the joint contribution of
    the various DVs.
  • Single ANOVAs are only justified after the
    overall MANOVA has proved significant.
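A sketch of such Bonferroni-corrected follow-up ANOVAs in Python (scipy's one-way ANOVA; the scores are illustrative placeholders consistent with the group means and variances quoted earlier, so substitute the real OCD.sav columns):

from scipy import stats

actions  = {"CBT": [5, 5, 4, 4, 5, 3, 7, 6, 6, 4],
            "BT":  [4, 4, 1, 1, 4, 6, 5, 5, 2, 5],
            "NT":  [4, 5, 5, 4, 6, 4, 7, 4, 6, 5]}
thoughts = {"CBT": [14, 11, 16, 13, 12, 14, 12, 15, 16, 11],
            "BT":  [14, 15, 13, 14, 15, 19, 13, 18, 14, 17],
            "NT":  [13, 15, 14, 14, 13, 20, 13, 16, 14, 18]}

alpha = 0.05 / 2                                   # Bonferroni correction for 2 follow-up ANOVAs
for name, data in [("actions", actions), ("thoughts", thoughts)]:
    F, p = stats.f_oneway(data["CBT"], data["BT"], data["NT"])
    print(name, round(F, 2), round(p, 3), "significant" if p < alpha else "n.s.")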

59
Follow-up analysis discriminant analysis
  • An alternative is to use 'discriminant analysis'
    which finds the linear combination(s) of the
    dependent variables that best discriminate(s)
    between the experimental groups.
  • Here, emphasis is laid on the relationships that
    exist between the DVs.
  • 'Discriminant analysis' reduces the DVs in terms
    of a set of underlying dimensions, not single
    dimensions as univariate ANOVAs do.

60
MANOVA on SPSS (using OCD.sav)
  • Analyze → General Linear Model → Multivariate

Model: Choose a full factorial design
Transfer 'actions' and 'thoughts' to the
Dependent Var's Window
Transfer 'group' to the 'Fixed Factors' Window
There is space for a covariate. This would then
be a MANCOVA
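The same overall test can be run outside SPSS; a minimal Python sketch using statsmodels' MANOVA (the column names and the use of pandas.read_spss are assumptions about how the OCD data are stored):

import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.read_spss("OCD.sav")                       # assumed columns: group, actions, thoughts

manova = MANOVA.from_formula("actions + thoughts ~ group", data=df)
print(manova.mv_test())                            # Pillai's trace, Wilks' lambda, Hotelling's, Roy's root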
61
Multiple comparisons: contrasts
Specify a 'simple' contrast for the
independent variable 'group'. This will yield
the following contrasts: 1 vs 3 and 2 vs 3.
Since our control group 'NT (no therapy)' is the
last group, we should choose it as the
reference category.
62
Post-hoc tests
Select REGW-Q and Games-Howell for the Post-hoc
tests
63
Options
Tick 'Bonferroni' correction!
The left-hand side options request the SSCP and
the error SSCP matrix
The right-hand side options are the same as for
ANOVA
64
Output of MANOVA: Preliminary analysis and testing
assumptions
  • Descriptives

Check the labels of the 3 levels of the
independent variable 'group'
Descriptive statistics for both DVs for the 3
therapies: patients have more obsessive thoughts
than actions
65
Equality of variances
Box's Test tests whether the covariance matrices
are equal. Since Box's test is n.s. this
assumption is met.
Box's test should be ignored when the group sample
sizes are equal; then Hotelling's T² and Pillai's
statistics are robust. Only when sample sizes
differ should Box's test be consulted. It should,
however, not be trusted if multivariate normality
cannot be assumed. A value of p < 0.001 is cause
for concern.
66
Sphericity
You can ignore Bartlett's Test of Sphericity.
Remember: sphericity only comes into play in
repeated-measures designs.
67
MANOVA test statistics
  • All 4 test statistics are produced. Except for
    Hotelling's trace, they are all significant.
  • → We can conclude that there is an overall
    effect. However, we don't know for which group or
    dependent variable the effect holds.

68
Following up on MANOVA (I): Univariate ANOVAs
Levene's test tests equality of variances. For
both DVs equality can be assumed.
  • Given the equality of variances for both DVs, we
    can trust the MANOVA.

69
Univariate ANOVAs: Between-subjects effects
There are no significant effects of 'group',
neither for 'actions' nor for 'thoughts'!
  • How can we explain the contradictory results that
    MANOVA found a significant effect of 'group'
    while the univariate ANOVAs did not?
  • → Because MANOVA has found that the groups differ
    along a combination of both DVs and not on any
    single one.

70
SSCP matrices: The model SSCP (H) and the
residual SSCP (E)
Same numbers as in our previous calculations
The Model SSCP (H) 'Hypothesis GROUP'
H
E
The Residual SSCP (E) 'Error'
  • The big SSs in the Error matrix (51 and 122)
    tell us that the MANOVA is significant not
    because of either DV alone but because of the
    relationship between the DVs (the cross-products).

71
SSCP Error matrix (E): The average SSCP
The covariance is the SSs divided by the df,
here k(n−1) = 3 × (10−1) = 27, e.g., 51/27 = 1.889 and
122/27 = 4.519
The correlation represents the standardized form
of the variance-covariance matrix.
  • Note: Bartlett's Test of Sphericity is based on
    this matrix.

72
Contrasts
Only in the 2 vs 3 contrast for 'actions' do the
CIs NOT cross 0, i.e., we can be confident that
the true difference is not 0.
CBT vs. NT: n.s. for both DVs
BT vs. NT: significant for 'actions', n.s. for 'thoughts'
  • The only significant contrast is between BT
    'Behavior therapy' (2) and NT 'No Treatment' (3)
    in the DV 'action'.
  • → BT reduces compulsive actions as compared to
    NT.
  • Note: The simple contrasts are carried out on
    each DV separately, hence as in univariate
    ANOVA.

73
Following up on MANOVA (II): Discriminant Analysis
  • This second approach is recommended over
    separate ANOVAs.
  • Analyze → Classify → Discriminant...

Transfer 'group' to the Grouping Variable
window. Transfer 'actions' and 'thoughts' to the
Independents window.
Define the range of the grouping variable: 1-3.
Enter independents together.
Discriminant Analysis looks for predictors for
separating a set of groups most efficiently.
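An equivalent discriminant analysis can be sketched with scikit-learn (hedged: the DataFrame and column names are assumptions, and sklearn's coefficients may differ from SPSS by a scaling convention):

import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

df = pd.read_spss("OCD.sav")                       # assumed columns: group, actions, thoughts
X = df[["actions", "thoughts"]]
y = df["group"]

lda = LinearDiscriminantAnalysis(n_components=2)   # 2 discriminant functions (variates)
scores = lda.fit_transform(X, y)                   # per-subject scores on the two variates
print(lda.explained_variance_ratio_)               # proportion of between-group variance per variate
print(lda.scalings_)                               # unstandardized discriminant coefficients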
74
Following up on MANOVA (II): Discriminant
Analysis: Statistics
Most of these options are already in the
MANOVA output. 'Unstandardized' produces
b-values for each variate.
Tells us about the relation between both DVs for
each group
75
Following up on MANOVA (II): Discriminant
Analysis: Classify
If your groups are not equally sized,
compute probabilities from group sizes.
A plot with the variate scores for
participants grouped according to therapy will be
produced
76
Following up on MANOVA (II): Discriminant
Analysis: Classify
Discriminant scores for each subject, on each
variate, will be produced. They will appear in the
original data file.
Finally, press 'OK'
77
Output of Discriminant Analysis
The covariances are obtained by taking the
cross-products (D1xD2) between the DVs for each
group and dividing them by the dfs (n-1).
Cross-product matrix
  • D1×D2 / (n−1):
  • 0.4/9 = 0.044
  • 22.6/9 = 2.511
  • −10/9 = −1.111

78
Output of Discriminant Analysis
  • In the CBT group, the 2 DVs (compulsive
    action/thought) are not related at all.
  • In the BT group, there is a positive relation
    between them
  • In the NT group, a negative relation.
  • We cannot say anything about the substantive
    relationship between the 2 DVs yet, since they
    are unstandardized.

79
Output of Discriminant Analysis
Action- and Thought- eigenvalues
  • The eigenvalues of the 2 variates correspond to
    those we had calculated before. V1 accounts for
    82.2% of the variance, V2 for 17.8%.
  • → The group differences shown in MANOVA can be
    accounted for by a single underlying factor.

Wilks' lambda is significant, but only for the 1st variate.
80
Output of Discriminant Analysis
The standardized coefficients tell us how highly
the DVs load on the two variates. They are
equivalent to the standardized betas in linear
regression. They can vary from −1 to 1. ACTIONS
loads highly positively on V1, THOUGHTS highly
negatively. (V2 can be ignored.)
Groups with values opposite in sign (+/−) are
discriminated by a variate. Both Dep Var's are
important since they have similarly strong
coefficients/factor loadings, only in opposite
directions.
The structure matrix gives us the canonical
variate correlation coefficients. These are like
factor loadings and tell us about the substantive
nature of the variates. As in the above matrix,
the coefficient is positive for 'actions'
and negative for 'thoughts'.
The positive loading of the first and the
negative loading of the second Dep var indicate
that the three therapy groups are separated in
terms of the difference between the Dep var's.
81
Output of Discriminant Analysis
The canonical discriminant function coefficients
are the unstandardized versions of the
standardized coefficients above. They are the
values of b in the regression equation
V1 = b0 + b1Actions + b2Thoughts
Compare the b's for variate 1 and 2 previously
The unstandardized values are less readily
interpretable but the standardized ones are
derived from them.
82
Output of Discriminant Analysis
The variate centroids are the mean variate
scores for each group. The centroids for V1
discriminate most strongly between the CBT group
(0.601) and the BT group (−0.726). (If we look at
the n.s. V2, its centroids discriminate most
strongly between the 2 therapy groups and the
no-treatment group.)
83
Output of Discriminant Analysis
The group centroids (mean variate scores) of V1
discriminate between the CBT and BT group.
Variate 1 is shown on the horizontal axis
(function 1) whereas variate 2 is shown on
the vertical axis (function 2). The centroids of
V1 discriminate well (consider the horizontal
difference between them). The centroids of V2 do
not discriminate much (the centroid of NT is
hidden behind a blue dot; the vertical distances
between the centroids are small).
The group centroid plot shows the mean variates
for each group along with the scores of each
individual
84
The final interpretation
  • How can we interpret the statistical results of
    MANOVA? What does all this mean?
  • The non-significant univariate ANOVAs tell us that
    the improvement is not simply in terms of
    'actions' or 'thoughts'.
  • MANOVA tells us that therapy groups can indeed be
    differentiated by a single underlying dimension
    which is neither 'action' nor 'thought' but the
    'Obsessive Compulsive Disorder' (OCD) itself. OCD
    is composed of compulsive actions and thoughts
    alike
  • The nature of the influence of therapy on OCD is
    unclear, however...

85
The final interpretation
  • Which is the best therapy?
  • In order to decide this, we have to look at the
    relation between the 2 DVs in the data for all 3
    therapy groups and compare their means.

86
Correlations between actions and thoughts in the
3 groups
Select cases (group = 1, group = 2, group = 3) and
request Graphs → Interactive → Scatterplots,
along with a regression line.
BT: positive relation between actions and thoughts
CBT: no relation between actions and thoughts
NT: incidental negative relation (n.s.) between
actions and thoughts
87
Means between the DVs 'action' and 'thought' for
each group
The means of 'actions' and 'thoughts' for all 3
groups show that compulsive actions are somewhat
reduced in BT while compulsive thoughts are
reduced in CBT, as compared to NT.
88
Comparing the 3 therapy groups
  • The discriminant analysis told us that actions
    are more important in terms of OCD: the
    standardized canonical discriminant function
    coefficient for 'actions' loaded highly positively
    on the first variate V1 (.829) while 'thoughts'
    loaded highly negatively (−.713). The same pattern
    was obtained in the structure matrix and for the
    distance between the group centroids (means).
  • → Hence, Behavior Therapy (BT) seems to be best
    since it reduces actions more than any other
    group, esp. CBT. However, we do not know whether
    BT was any better than NT.

89
The 3 therapy groups and the construct OCD
  • "...Behavior Therapy (BT) has the most influence
    on OCD as a construct, because of the relative
    importance of behaviours in that construct
    compared to cognitions." (Field, 2005, p. 616)

90
Univariate ANOVA AND Discriminant Analysis
  • You should run Univariate ANOVA after MANOVA in
    order to fully understand the data.
  • You should also run a Discriminant Analysis since
    it informs you best about the underlying
    dimensions of the various DVs.