1
Quantitative Issues in Research Methods
ANZAM Mid-Year Doctoral Workshop, 9 June 2009
David V. Day, Woodside Professor of Leadership and
Management, University of Western Australia
Business School
2
Session Overview
  • Introductions
  • Measurement, Design, Analysis
  • 10 Common Research Mistakes
  • Designing Problem-Centred Research
  • Discussion and Additional Issues

3
Statistics Throughout History
  • There are 3 kinds of lies: lies, damned lies,
    and statistics. --Mark Twain (1906), courtesy of
    Benjamin Disraeli
  • Some people hate the very name statistics, but I
    find them full of beauty and interest. Whenever
    they are not brutalised, but delicately handled
    by the higher methods, and are warily
    interpreted, their power of dealing with
    complicated phenomena is extraordinary.
    --Sir Francis Galton (1889)
  • No amount of technical proficiency will do you
    any good if you do not think. --Pedhazur &
    Schmelkin (1991)

4
(No Transcript)
5
Systems Approach to Research
(Diagram: RESEARCH PROBLEM linked to MEASUREMENT, DESIGN, and ANALYSIS)
Pedhazur, E. J., & Schmelkin, L. P. (1991).
Measurement, design, and analysis: An integrated
approach. Hillsdale, NJ: Erlbaum.
6
Systems Approach to Research
  • MEASUREMENT
  • Quantification of constructs or objects
  • Supplies the numbers used in statistical analyses
  • Meaningless numbers = meaningless results!
  • Scales of measurement
  • Nominal, ordinal, interval, ratio
  • Classical Test Theory (X = T + E)
  • Reliability
  • Stability
  • Internal consistency
  • Validity
  • Numbers measure what they claim to measure
  • Reference
  • Nunnally, J., & Bernstein, I. H. (1993).
    Psychometric theory (3rd ed.). New York:
    McGraw-Hill.
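
A minimal sketch of the internal-consistency idea above: Cronbach's alpha computed from an item-score matrix. The `items` data frame, its column names, and the responses are made up purely for illustration.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical 5-item scale completed by 6 respondents
items = pd.DataFrame({
    "q1": [4, 5, 3, 4, 2, 5],
    "q2": [4, 4, 3, 5, 2, 5],
    "q3": [3, 5, 2, 4, 3, 4],
    "q4": [4, 4, 3, 4, 2, 5],
    "q5": [5, 5, 3, 4, 2, 4],
})
print(f"alpha = {cronbach_alpha(items):.2f}")  # values below about .70 flag a lousy measure (see slide 15)
```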

7
Systems Approach to Research
  • DESIGN
  • Research questions
  • General, theory-based statements about study
    goals
  • Hypotheses
  • Specific statements about the predicted
    relationships between or among variables
  • Causality
  • Random assignment
  • Control condition
  • Manipulation of IV
  • Threats to validity (alternative explanations)
  • Statistical conclusions: inferences from
    statistical tests
  • Internal: assertions regarding effects of IV(s)
    on DV(s)
  • Construct: correspondence between measure and
    construct
  • External: generalizability of findings to target
    populations, settings
  • Reference
  • Cook, T. D., & Campbell, D. T. (1979).
    Quasi-experimentation: Design and analysis for
    field settings. Boston: Houghton Mifflin.

8
Systems Approach to Research
  • ANALYSIS
  • Univariate
  • Mean, standard deviation
  • Bivariate
  • Correlation
  • Multivariate
  • Multiple regression
  • Structural equation modeling
  • Types of statistics
  • Nonparametric statistics
  • Descriptive statistics
  • Inferential statistics
  • Key assumptions
  • Normality
  • Homoscedasticity
  • Independence

Reference: Cohen et al. (2003). Applied multiple
regression/correlation analysis for the
behavioral sciences (3rd ed.). Mahwah, NJ: Erlbaum.
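
To make the analysis layer concrete, here is a minimal sketch that fits a multiple regression and checks the assumptions listed above (normality, homoscedasticity, and independence of the residuals). The data frame and variable names are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm
from scipy.stats import shapiro
from statsmodels.stats.diagnostic import het_breuschpagan
from statsmodels.stats.stattools import durbin_watson

# Hypothetical data: predict job performance from satisfaction and tenure
df = pd.DataFrame({
    "performance":  [3.2, 4.1, 2.8, 3.9, 4.5, 3.0, 3.7, 4.2, 3.4, 4.0],
    "satisfaction": [3.0, 4.0, 2.5, 3.8, 4.6, 2.9, 3.5, 4.1, 3.2, 3.9],
    "tenure":       [2,   5,   1,   4,   7,   2,   3,   6,   2,   5],
})

X = sm.add_constant(df[["satisfaction", "tenure"]])
model = sm.OLS(df["performance"], X).fit()
print(model.params)

# Assumption checks on the residuals
stat, p_norm = shapiro(model.resid)
print("Normality (Shapiro-Wilk p):", p_norm)
print("Homoscedasticity (Breusch-Pagan p):", het_breuschpagan(model.resid, X)[1])
print("Independence (Durbin-Watson, ~2 is ideal):", durbin_watson(model.resid))
```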
9
10 Common Research Mistakes
And how to avoid them
10
1. No theory.
11
What Theory is Not
  • References are not theory.
  • Data are not theory.
  • Lists of variables or constructs are not theory.
  • Diagrams are not theory.
  • Hypotheses are not theory.
  • Reference
  • Sutton, R. I., & Staw, B. M. (1995). What theory
    is not. Administrative Science Quarterly, 40,
    371-384.

12
2. Untestable hypotheses.
13
What's wrong?
  • H1: Job satisfaction occurs only under certain
    conditions.
  • H2: Job satisfaction significantly predicts job
    performance.
  • H3: Job satisfaction positively influences job
    performance.
  • H4: Job satisfaction is unrelated to performance.

14
  • 3. Lousy or inappropriate measures.

15
Common Measurement Problems
  • Poor reliability (e.g., α < .70) = lousy measure
  • Jingle fallacy
  • Two constructs with equivalent labels are in
    reality quite different (e.g., measures labelled
    impulsivity may reflect constructs as diverse as
    a short attention span and a tendency to
    participate in risky behaviour).
  • Jangle fallacy
  • Two constructs with different labels are actually
    the same
  • Reference
  • Block, J. (1995). A contrarian view of the
    five-factor approach to personality description.
    Psychological Bulletin, 117, 187-215.

16
  • 4. Single-shot, cross-sectional, self-report
    survey designs.

17
Common Method Bias
  • Common Method Variance
  • Variance attributable to measurement method
    rather than to the constructs the measures
    represent.
  • Sources of Common Method Biases
  • Common source or rater
  • Item characteristics (e.g., social desirability)
  • Item context
  • Measurement context
  • Study design vs. Statistical remedies
  • Reference
  • Podsakoff, P. M., MacKenzie, S. B., Lee, J.-Y.,
    & Podsakoff, N. P. (2003). Common method biases in
    behavioral research: A critical review of the
    literature and recommended remedies. Journal of
    Applied Psychology, 88, 879-903.
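
Podsakoff et al. emphasise design-based remedies, but one widely reported (and admittedly crude) statistical check is Harman's single-factor test: if a single unrotated factor accounts for most of the variance across all self-report items, common method variance is a plausible concern. A minimal sketch using the eigenvalues of the item correlation matrix; the simulated item matrix is purely illustrative.

```python
import numpy as np

def first_factor_share(items: np.ndarray) -> float:
    """Share of total variance captured by the first unrotated principal component."""
    corr = np.corrcoef(items, rowvar=False)   # item correlation matrix
    eigenvalues = np.linalg.eigvalsh(corr)    # ascending order
    return eigenvalues[-1] / eigenvalues.sum()

# Hypothetical responses (rows = respondents, columns = items) with a shared "method" component
rng = np.random.default_rng(0)
items = rng.normal(size=(100, 8)) + rng.normal(size=(100, 1))
print(f"First component explains {first_factor_share(items):.0%} of the variance")
```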

18
  • 5. Violating statistical assumptions.

19
What's wrong?
  • Employees (n = 200) rate their supervisors
    (n = 50) on perceived leadership behaviour.
  • Leadership ratings are correlated with job
    satisfaction ratings of employees.
  • Correlation found to be large (r = .75) and
    statistically significant (p < .001).
  • WOW!!

20
Comparing OLS with HLM
Y = b0 + b1X1 + e
(Figure: Y plotted against X)
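
The nesting in the example above (employees rated within supervisors) violates the independence assumption of OLS; a multilevel (HLM-style) model adds a random intercept per supervisor. A minimal sketch with statsmodels, using simulated data and hypothetical variable names (satisfaction, leadership, supervisor_id):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: 200 employees nested within 50 supervisors
rng = np.random.default_rng(1)
supervisor_id = np.repeat(np.arange(50), 4)
supervisor_effect = rng.normal(size=50)[supervisor_id]   # variance shared within each supervisor
leadership = supervisor_effect + rng.normal(size=200)
satisfaction = 0.3 * leadership + supervisor_effect + rng.normal(size=200)
df = pd.DataFrame({"supervisor_id": supervisor_id,
                   "leadership": leadership,
                   "satisfaction": satisfaction})

# Naive OLS treats all 200 ratings as independent observations
ols = smf.ols("satisfaction ~ leadership", data=df).fit()

# Random-intercept (HLM-style) model acknowledges the nesting
hlm = smf.mixedlm("satisfaction ~ leadership", data=df, groups=df["supervisor_id"]).fit()

print("OLS slope:", ols.params["leadership"])
print("HLM slope:", hlm.fe_params["leadership"])
```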
21
  • 6. Model conceptual confusion.

22
Mediators & Moderators
  • Mediation
  • X → M → Y
  • Moderation
  • X → Y varies as a function of Z (moderator)
  • Moderated mediation (Z moderates X → M)
  • Mediated moderation (Z moderates M → Y)
  • Reference
  • Edwards, J. R., & Lambert, L. S. (2007). Methods
    for integrating moderation and mediation: A
    general analytical framework using moderated path
    analysis. Psychological Methods, 12, 1-22.
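
To keep the two models distinct in practice: a simple mediation can be estimated as two regressions (indirect effect = a * b), and a moderation as a product term. The sketch below uses simulated data and placeholder variable names (x, m, y, z); Edwards and Lambert's full framework (moderated path analysis with bootstrapped effects) goes well beyond this.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data with placeholder variable names x, m, y, z
rng = np.random.default_rng(2)
n = 300
x = rng.normal(size=n)
z = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.4 * m + 0.2 * x + 0.3 * x * z + rng.normal(size=n)
df = pd.DataFrame({"x": x, "m": m, "y": y, "z": z})

# Mediation (X -> M -> Y): two equations; indirect effect = a * b
a_path = smf.ols("m ~ x", data=df).fit()          # a path: X -> M
b_path = smf.ols("y ~ x + m", data=df).fit()      # b path: M -> Y, controlling for X
print("indirect effect:", a_path.params["x"] * b_path.params["m"])

# Moderation: the X -> Y slope varies with Z, captured by the product term
moderation = smf.ols("y ~ x * z", data=df).fit()  # x * z expands to x + z + x:z
print("interaction term:", moderation.params["x:z"])
```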

23
  • 7. Levels of analysis confusion.

24
What's wrong?
  • Researcher hypothesizes that team climate is
    related to team performance.
  • Team members complete individual climate surveys,
    which are correlated with team performance.
  • What's wrong? (Hint: within-group agreement or
    reliability)
  • Reference
  • Bliese, P. D. (2000). Within-group agreement,
    non-independence, and reliability: Implications
    for data aggregation and analysis. In K. J. Klein
    & S. W. J. Kozlowski (Eds.), Multilevel theory,
    research, and methods in organizations:
    Foundations, extensions, and new directions (pp.
    349-381). San Francisco: Jossey-Bass.
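
Before individual climate ratings are aggregated to the team level, indices such as ICC(1) (alongside within-group agreement statistics such as rwg) should justify the aggregation. A minimal sketch of ICC(1) from the one-way ANOVA mean squares, with simulated data and hypothetical column names:

```python
import numpy as np
import pandas as pd

def icc1(df: pd.DataFrame, group: str, value: str) -> float:
    """ICC(1) = (MSB - MSW) / (MSB + (k - 1) * MSW) from a one-way ANOVA."""
    grouped = df.groupby(group)[value]
    n_groups = grouped.ngroups
    k = grouped.size().mean()                      # (average) group size
    grand_mean = df[value].mean()
    ms_between = (grouped.size() * (grouped.mean() - grand_mean) ** 2).sum() / (n_groups - 1)
    ms_within = ((df[value] - grouped.transform("mean")) ** 2).sum() / (len(df) - n_groups)
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical climate ratings from members of 20 teams, with a shared team component
rng = np.random.default_rng(3)
team = np.repeat(np.arange(20), 5)
rating = rng.normal(size=100) + rng.normal(size=20)[team]
df = pd.DataFrame({"team": team, "rating": rating})
print(f"ICC(1) = {icc1(df, 'team', 'rating'):.2f}")
```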

25
  • 8. Making strong causal inferences from weak
    research designs.

26
What's wrong?
  • Researcher hypothesizes a mediational relationship
  • Well-being (M) mediates the relationship between
    job autonomy (X) and job performance (Y)
  • Support found for full mediation: X → M → Y
  • No direct effect of X on Y (which would have
    indicated partial mediation)
  • Researcher concludes that X influences M and that
    M affects Y

27
9. Inadequate statistical power.
28
Concepts of Power Analysis
  • Type I error (α)
  • Reject null hypothesis when true (false positive)
  • Type II error (β)
  • Fail to reject null when false (false negative)
  • Statistical power (1 - β)
  • Rejecting null when false (not making Type II
    error)
  • Function of α, effect size, and SAMPLE SIZE
  • Reference
  • Cohen, J. (1988). Statistical power analysis for
    the behavioral sciences (2nd ed.). Hillsdale, NJ:
    Erlbaum.
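
An a priori power analysis ties these pieces together: fix α and the target power, assume an effect size, and solve for the required sample size. A minimal sketch with statsmodels; the effect size of d = 0.4 is an illustrative assumption, not a recommendation.

```python
from statsmodels.stats.power import TTestIndPower

# Required n per group for an independent-samples t test,
# assuming Cohen's d = 0.4, alpha = .05, target power = .80
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.4, alpha=0.05, power=0.80,
                                   alternative="two-sided")
print(f"required n per group: {n_per_group:.0f}")

# Power actually achieved with a fixed sample of 50 per group
print(f"power at n = 50: {analysis.power(effect_size=0.4, nobs1=50, alpha=0.05):.2f}")
```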

29
10. Believing that fancy statistics will
compensate for lousy measures or a poor design.
30
(No Transcript)
31
Designing Effective Research
  • Identify interesting problem
  • Formulate key research questions
  • Derive hypotheses
  • Choose/develop measures (operationalise
    constructs)
  • Design research strategy
  • Outline analysis approach