Experimental Methods 626 PowerPoint PPT Presentation


1
Experimental Methods 626
  • Science, Theory, Research Application

2
Objectives
  • Recall goals of science
  • Describe role of theory
  • Recall types of theories
  • Recall criteria for evaluating theory

3
Some questions
  • What is science?
  • What is the purpose of science?
  • Is there an omni-everything God?
  • What is the relationship between religion and
    science?
  • Is there free will?
  • Is there a reality?
  • Is science the description of that reality?
  • What is the essential tension?

4
Goals of Science
5
Role of theory (nothing so practical as a good
theory)
  • Identify predictors
  • Make predictions
  • Identify interventions
  • Identify transference of interventions (e.g.,
    antibodies)
  • Heuristic value serving to discover or to
    stimulate investigation

6
The problem-solving paradigm
  • Identify problem (discrepancy from some goal)
  • Hypothesize source of problem
  • Generate alternative solutions
  • Evaluate alternatives (compare anticipated results if solution adopted)
  • Pick alternative
  • Implement alternative
  • Compare results with goal; revisit if discrepancy persists or arises again
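The steps above form an iterative loop, which can be sketched in code. This is a toy illustration only; the function name, the numeric "state," and the candidate solutions are hypothetical stand-ins, not anything from the slides.

```python
def solve(goal, state, max_cycles=10):
    """Toy problem-solving loop: state is a number, solutions are increments."""
    for _ in range(max_cycles):
        discrepancy = goal - state            # identify problem
        if discrepancy <= 0:
            break                             # no discrepancy: goal met
        # hypothesize source; generate alternative solutions (placeholder steps)
        candidates = [discrepancy, discrepancy / 2, 1]
        # evaluate alternatives: anticipated result if each were adopted
        best = max(candidates, key=lambda step: min(state + step, goal))
        state += best                         # pick and implement alternative
        # loop repeats: compare results with goal, revisit if discrepancy persists
    return state

print(solve(goal=10, state=0))  # reaches the goal of 10
```

The loop structure mirrors the slide: each pass identifies the discrepancy, generates and evaluates alternatives, implements one, and then re-checks against the goal.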

7
Role of research
  • Used to test theories
  • Used to confirm effectiveness of
    interventions/predictions

8
Types of theories
  • Extrapolation (folk remedies)
    • e.g., intelligence as predictor of school performance
    • e.g., sleep deprivation as predictor of mono
  • Abstraction (constructs and relationships)
    • One level of explanation down from extrapolation
    • e.g., intelligence as construct, self-esteem
    • Extrapolations: e.g., immune-compromising interventions' effect on mono
  • Models
    • Representation of how physical or functional components fit together
    • e.g., intelligence as organization of brain . . .
    • e.g., germ theory of mono
  • Typology or taxonomy (e.g., periodic table of elements)

9
Criteria for evaluating theory (all relative to
competition)
10
Criteria (contd) and Epistemological paradigms
11
All knowledge is tentative
  • Alternative explanations!!!
  • General
    • Chance
    • Reverse causality
    • Demand characteristics
    • Etc.
  • Specific
    • Two or more theories can explain the data

12
Validity of interpretations of results (not of the study)
  • Construct validity
    • Measuring/manipulating less than you think
    • Measuring/manipulating more than you think, i.e., confounding
    • Can it be measured/manipulated?
  • Statistical conclusion validity
    • Chance or misapplication of analysis as sources of alternative explanation (generally the subject of statistics classes)
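Chance as an alternative explanation can be made concrete with a small simulation. This is an illustration added here, not from the slides: when the null hypothesis is actually true, p-values are uniform, so roughly alpha of all comparisons still come out "significant" by chance alone.

```python
import random

random.seed(0)

# Under a true null hypothesis, p-values are uniform on [0, 1].
def null_pvalue():
    return random.random()

alpha = 0.05
trials = 10_000
false_positives = sum(null_pvalue() < alpha for _ in range(trials))
rate = false_positives / trials
print(rate)  # close to 0.05: chance alone produces "significant" results
```

This is why chance must always be ruled out before a theory is credited with explaining a result.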

13
Internal validity
  • Extent causal inferences can be drawn
  • _____________
  • _________________
  • No alternative explanations
    • Reverse causality (problem with passive observation designs if theory is weak)
    • Third variable (confounds with manipulations, unmeasured variables)
    • Alternative conceptualizations of cause
      • e.g., horse and fence
      • e.g., open v. closed
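The third-variable threat can be shown with a short simulation. The variable names below are hypothetical, chosen to echo the slides' mono example: here an unmeasured "stress" variable drives both sleep deprivation and mono risk, so the two correlate even though neither causes the other in this simulated world.

```python
import random

random.seed(1)

# Simulated confound: stress -> sleep deprivation, stress -> mono risk.
n = 5000
stress = [random.gauss(0, 1) for _ in range(n)]
sleep_dep = [s + random.gauss(0, 1) for s in stress]
mono_risk = [s + random.gauss(0, 1) for s in stress]

def corr(x, y):
    """Pearson correlation coefficient."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

r = corr(sleep_dep, mono_risk)
print(round(r, 2))  # clearly positive, despite no direct causal link
```

A passive-observation design that measured only sleep deprivation and mono risk would see this correlation and could wrongly infer a direct causal link.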

14
External validity
  • Degree findings can generalize across:
    • Samples/populations
    • Contexts
    • Behaviors
    • Time
    • Operationalizations of variables (overlaps construct validity)

15
Role of application
  • It's the point, stupid
  • Prediction
    • At this point cause does not matter
  • Program evaluation research
    • Cause is everything
    • Should we put our money in this again? On a grander scale?
    • Does less work?
    • Generally uses the kitchen-sink method, so confounding is a problem

16
Next class
  • What are the techniques of the social sciences?