Experiment Basics: Variables - PowerPoint PPT Presentation

1
Experiment Basics Variables
  • Psych 231 Research Methods in Psychology

2
Exam 1
  • Results
  • Mean: 81.7
  • Range: 60-96
  • If you want to go over your exam, set up a time to
    see me.

3
So you want to do an experiment?
4
So you want to do an experiment?
  • You've got your theory.
  • What behavior you want to examine
  • Identified what things (variables) you think
    affect that behavior

5
So you want to do an experiment?
  • You've got your theory.
  • Next you need to derive predictions from the
    theory.
  • These should be stated as hypotheses.
  • In terms of conceptual variables or constructs
  • Conceptual variables are abstract theoretical
    entities
  • Consider our class experiment
  • Hypothesis
  • What you try to memorize and how you try to
    memorize it will impact memory performance.

6
So you want to do an experiment?
  • You've got your theory.
  • Next you need to derive predictions from the
    theory.
  • Now you need to design the experiment.
  • You need to operationalize your variables in
    terms of how they will be
  • Manipulated
  • Measured
  • Controlled
  • Be aware of the underlying assumptions connecting
    your constructs to your operational variables
  • Be prepared to justify all of your choices

7
Constants vs. Variables
  • Characteristics of the psychological situations
  • Constants have the same value for all
    individuals in the situation
  • Variables have potentially different values for
    each individual in the situation
  • Variables in our experiment
  • Levels of processing
  • Type of words
  • Memory performance
  • time for recall
  • kind of filler task given
  • pacing of reading the words on the list

8
Variables
  • Conceptual vs. Operational
  • Conceptual variables (constructs) are abstract
    theoretical entities
  • Operational variables are defined in terms within
    the experiment. They are concrete so that they
    can be measured or manipulated

Conceptual vs. operational variables in our experiment:
  • How we memorize (levels of processing): "Has an 'a'?" vs. "Related to ISU?" judgment
  • Kinds of things: words rated as abstract or concrete
  • Memory: memory test
9
Many kinds of Variables
  • Independent variables (explanatory)
  • Dependent variables (response)
  • Extraneous variables
  • Control variables
  • Random variables
  • Confound variables

10
Many kinds of Variables
  • Independent variables (explanatory)
  • Dependent variables (response)
  • Extraneous variables
  • Control variables
  • Random variables
  • Confound variables

11
Independent Variables
  • The variables that are manipulated by the
    experimenter (sometimes called factors)
  • Each IV must have at least two levels
  • Remember the point of an experiment is comparison
  • Combination of all the levels of all of the IVs
    results in the different conditions in an
    experiment

12
Independent Variables
  • Example designs: 1 factor with 2 levels; 1 factor with 3 levels; 2 factors with 2 x 3 levels (see the sketch below)
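To make the condition count concrete, here is a minimal Python sketch of how crossing the levels of the IVs generates the set of conditions; the factor names and levels are hypothetical stand-ins, not necessarily the ones used in the class experiment.

```python
from itertools import product

# Hypothetical factors: one IV with 2 levels crossed with one IV with 3 levels.
factors = {
    "processing": ["shallow", "deep"],
    "word_type": ["abstract", "concrete", "neutral"],
}

# Every combination of levels is one condition: 2 x 3 = 6 conditions.
conditions = list(product(*factors.values()))
for levels in conditions:
    print(dict(zip(factors, levels)))

print(len(conditions), "conditions")  # 6 conditions
```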
13
Manipulating your independent variable
  • Methods of manipulation
  • Straightforward
  • Stimulus manipulation: different conditions use
    different stimuli
  • Instructional manipulation: different groups are
    given different instructions
  • Staged
  • Event manipulation: manipulate characteristics
    of the context, setting, etc.
  • Subject (participant) manipulation: there are (mostly
    pre-existing) differences between the subjects in the
    different conditions
  • leads to a quasi-experiment

In our experiment: abstract vs. concrete words; "Has an 'a'" vs. ISU-related judgments
14
Choosing your independent variable
  • Choosing the right levels of your independent
    variable
  • Review the literature
  • Do a pilot experiment
  • Consider the costs, your resources, your
    limitations
  • Be realistic
  • Pick levels found in the real world
  • Pay attention to the range of the levels
  • Pick a large enough range to show the effect
  • Aim for the middle of the range

15
Identifying potential problems
  • These are things that you want to try to avoid by
    careful selection of the levels of your IV (may
    be issues for your DV as well).
  • Demand characteristics
  • Experimenter bias
  • Reactivity
  • Floor and ceiling effects

16
Demand characteristics
  • Characteristics of the study that may give away
    the purpose of the experiment
  • May influence how the participants behave in the
    study
  • Examples
  • Experiment title: "The effects of horror movies on
    mood"
  • Obvious manipulation: ten psychology students
    looking straight up
  • Biased or leading questions: "Don't you think it's
    bad to murder unborn children?"

17
Experimenter Bias
  • Experimenter bias (expectancy effects)
  • The experimenter may influence the results
    (intentionally and unintentionally)
  • E.g., Clever Hans
  • One solution is to keep the experimenter (as well
    as the participants) "blind" as to which
    conditions are being tested

18
Reactivity
  • Knowing that you are being measured
  • Just being in an experimental setting, people
    don't always respond the way that they normally
    would.
  • Cooperative
  • Defensive
  • Non-cooperative
19
Floor effects
  • A value below which a response cannot be made
  • As a result, the effects of your IV (if there are
    indeed any) can't be seen.
  • Imagine a task that is so difficult that none of
    your participants can do it.

20
Ceiling effects
  • When the dependent variable reaches a level that
    cannot be exceeded
  • So while there may be an effect of the IV, that
    effect can't be seen because everybody has maxed
    out
  • Imagine a task that is so easy that everybody
    scores 100
  • To avoid floor and ceiling effects, pick levels of
    your IV that result in middle-level performance on
    your DV (see the sketch after this list)
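One rough way to screen pilot data for floor or ceiling problems is to check how many participants land at the minimum or maximum possible score. The sketch below is only an illustration; the scores and the 25% cutoff are arbitrary, not a standard rule.

```python
# Flag a possible floor or ceiling effect if too many participants sit at the
# minimum or maximum possible score. Cutoff and scores are arbitrary examples.
def floor_ceiling_check(scores, min_score, max_score, cutoff=0.25):
    n = len(scores)
    at_floor = sum(s == min_score for s in scores) / n
    at_ceiling = sum(s == max_score for s in scores) / n
    return {"floor_effect": at_floor >= cutoff, "ceiling_effect": at_ceiling >= cutoff}

scores = [100, 98, 100, 100, 95, 100, 100, 92]   # hypothetical very easy task
print(floor_ceiling_check(scores, min_score=0, max_score=100))
# {'floor_effect': False, 'ceiling_effect': True}
```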

21
Variables
  • Independent variables (explanatory)
  • Dependent variables (response)
  • Extraneous variables
  • Control variables
  • Random variables
  • Confound variables

22
Dependent Variables
  • The variables that are measured by the
    experimenter
  • They are dependent on the independent variables
    (if there is a relationship between the IV and DV
    as the hypothesis predicts).
  • Consider our class experiment
  • Conceptual level: memory
  • Operational level: recall test
  • Present list of words; participants make a
    judgment for each word
  • 15 sec. of filler (counting backwards by 3s)
  • Measure the accuracy of recall (see the scoring sketch below)
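A minimal sketch of how the recall-accuracy measure could be scored, assuming free recall is scored as the proportion of studied words produced; the word lists are hypothetical.

```python
# Score free recall as the proportion of studied words the participant produced.
def recall_accuracy(studied: list[str], recalled: list[str]) -> float:
    """Proportion of studied words correctly recalled (case-insensitive)."""
    studied_set = {w.lower() for w in studied}
    hits = {w.lower() for w in recalled} & studied_set
    return len(hits) / len(studied_set)

studied = ["table", "cloud", "justice", "apple", "freedom"]
recalled = ["apple", "table", "banana"]        # "banana" is an intrusion, not counted
print(recall_accuracy(studied, recalled))      # 0.4
```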

23
Choosing your dependent variable
  • How to measure your construct
  • Can the participant provide a self-report?
  • Introspection: specially trained observers report on
    their own thought processes; the method fell out of
    favor in the early 1900s
  • Rating scales: strongly agree / agree / undecided /
    disagree / strongly disagree
  • Is the dependent variable directly observable?
  • Choice/decision (sometimes timed)
  • Is the dependent variable indirectly observable?
  • Physiological measures (e.g. GSR, heart rate)
  • Behavioral measures (e.g. speed, accuracy)

24
Measuring your dependent variables
  • Scales of measurement
  • Errors in measurement

25
Measuring your dependent variables
  • Scales of measurement
  • Errors in measurement

26
Measuring your dependent variables
  • Scales of measurement: the correspondence
    between the numbers and the properties
    that we're measuring
  • The scale that you use will (partially) determine
    what kinds of statistical analyses you can perform

27
Scales of measurement
  • Categorical variables
  • Nominal scale
  • Quantitative variables

28
Scales of measurement
  • Nominal scale: consists of a set of categories
    that have different names.
  • Labels and categorizes observations
  • Does not make any quantitative distinctions between
    observations
  • Example: eye color

29
Scales of measurement
  • Categorical variables
  • Nominal scale
  • Ordinal scale
  • Quantitative variables

30
Scales of measurement
  • Ordinal scale: consists of a set of categories
    that are organized in an ordered sequence.
  • Ranks observations in terms of size or magnitude.
  • Example: T-shirt size

31
Scales of measurement
  • Categorical variables
  • Nominal scale
  • Ordinal scale
  • Quantitative variables
  • Interval scale

32
Scales of measurement
  • Interval scale: consists of ordered categories
    where all of the categories are intervals of
    exactly the same size.
  • With an interval scale, equal differences between
    numbers on the scale reflect equal differences in
    magnitude.
  • Ratios of magnitudes are not meaningful.
  • Example: the Fahrenheit temperature scale
    (40º is not twice as hot as 20º; see the sketch below)
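To see why the ratio is not meaningful on the Fahrenheit scale, convert both temperatures to a scale with an absolute zero. A minimal sketch, using Kelvin as the reference scale:

```python
def fahrenheit_to_kelvin(f: float) -> float:
    """Convert Fahrenheit to Kelvin, a ratio scale with an absolute zero point."""
    return (f - 32) * 5 / 9 + 273.15

# On the Fahrenheit (interval) scale the numbers suggest "twice as hot" ...
print(40 / 20)                                              # 2.0
# ... but on a ratio scale the same temperatures differ by only about 4%.
print(fahrenheit_to_kelvin(40) / fahrenheit_to_kelvin(20))  # ~1.04
```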
33
Scales of measurement
  • Categorical variables
  • Nominal scale
  • Ordinal scale
  • Quantitative variables
  • Interval scale
  • Ratio scale

34
Scales of measurement
  • Ratio scale: an interval scale with the
    additional feature of an absolute zero point.
  • Ratios of numbers DO reflect ratios of magnitude.
  • It is easy to get ratio and interval scales
    confused.
  • Example: measuring your height with playing cards

35
Scales of measurement
Ratio scale
8 cards high
36
Scales of measurement
Interval scale
5 cards high
37
Scales of measurement
Interval scale: 5 cards high (0 cards high means as tall as the table)
Ratio scale: 8 cards high (0 cards high means no height)
38
Scales of measurement
  • Categorical variables
  • Nominal scale
  • Ordinal scale
  • Quantitative variables
  • Interval scale
  • Ratio scale

Best Scale?
  • Given a choice, we usually prefer the highest level
    of measurement possible (see the sketch below)
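As a rough illustration of how the scale constrains analysis, the sketch below maps each scale to the summary statistics conventionally treated as meaningful; the mapping is a simplification for illustration, not a strict rule.

```python
# Conventional (simplified) mapping from measurement scale to summary
# statistics usually treated as meaningful at that scale.
MEANINGFUL_STATS = {
    "nominal":  ["mode", "frequency counts"],
    "ordinal":  ["mode", "median", "percentiles"],
    "interval": ["mode", "median", "mean", "standard deviation"],
    "ratio":    ["mode", "median", "mean", "standard deviation", "meaningful ratios"],
}

def stats_for(scale: str) -> list[str]:
    """Return the statistics conventionally considered meaningful for a scale."""
    return MEANINGFUL_STATS[scale.lower()]

print(stats_for("ordinal"))  # ['mode', 'median', 'percentiles']
```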

39
Measuring your dependent variables
  • Scales of measurement
  • Errors in measurement
  • Reliability and validity

40
Example: Measuring intelligence?
  • How do we measure the construct?
  • How good is our measure?
  • How does it compare to other measures of the
    construct?
  • Is it a self-consistent measure?

Reliability and Validity
41
Errors in measurement
  • Reliability
  • If you measure the same thing twice (or have two
    measures of the same thing) do you get the same
    values?
  • Validity
  • Does your measure really measure what it is
    supposed to measure?
  • Does our measure really measure the construct?
  • Is there bias in our measurement?

42
Reliability and Validity
  • Reliability: consistency
  • Validity: measuring what is intended
(Figure: reliable and valid; reliable but invalid; unreliable and invalid)
43
Reliability
  • Observed score = true score + measurement error
  • A reliable measure will have a small amount of
    error
  • Multiple kinds of reliability

44
Reliability
  • Test-retest reliability (see the sketch below)
  • Test the same participants more than once
  • Measurement from the same person at two different
    times
  • Should be consistent across different
    administrations

(Figure: consistent scores across the two administrations are reliable; inconsistent scores are unreliable)
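Test-retest reliability is typically summarized as the correlation between the two administrations. A minimal sketch that computes a Pearson r for hypothetical scores:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between paired scores from two administrations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

time1 = [12, 15, 9, 20, 17, 11]   # hypothetical scores, first administration
time2 = [13, 14, 10, 19, 18, 12]  # same participants, second administration
print(round(pearson_r(time1, time2), 2))  # ~0.98; a high r suggests good test-retest reliability
```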
45
Reliability
  • Internal consistency reliability
  • Multiple items testing the same construct
  • Extent to which scores on the items of a measure
    correlate with each other
  • Cronbach's alpha (α) (see the sketch below)
  • Split-half reliability
  • Correlation of score on one half of the measure
    with the other half (randomly determined)
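A minimal sketch of Cronbach's alpha, using the standard formula alpha = k/(k - 1) * (1 - sum of item variances / variance of total scores); the item ratings are hypothetical.

```python
def variance(xs):
    """Sample variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of participants' scores per item (all the same length)."""
    k = len(items)                                    # number of items
    totals = [sum(scores) for scores in zip(*items)]  # each participant's total score
    return (k / (k - 1)) * (1 - sum(variance(i) for i in items) / variance(totals))

# 3 items answered by 5 participants (hypothetical 1-5 ratings)
items = [[4, 5, 3, 4, 2],
         [4, 4, 3, 5, 2],
         [3, 5, 4, 4, 1]]
print(round(cronbach_alpha(items), 2))  # ~0.9 for these made-up ratings
```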

46
Reliability
  • Inter-rater reliability
  • At least 2 raters observe behavior
  • Extent to which raters agree in their
    observations
  • Are the raters consistent? (see the sketch below)
  • Requires some training in judgment
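A crude sketch of inter-rater reliability as simple percent agreement between two raters; the codings are hypothetical, and in practice a statistic such as Cohen's kappa, which also corrects for chance agreement, is often preferred.

```python
# Percent agreement: the proportion of observations the two raters coded identically.
def percent_agreement(rater_a, rater_b):
    agree = sum(a == b for a, b in zip(rater_a, rater_b))
    return agree / len(rater_a)

rater_a = ["aggressive", "neutral", "neutral", "aggressive", "neutral"]
rater_b = ["aggressive", "neutral", "aggressive", "aggressive", "neutral"]
print(percent_agreement(rater_a, rater_b))  # 0.8
```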

47
Validity
  • Does your measure really measure what it is
    supposed to measure?
  • There are many kinds of validity

48
Many kinds of validity (diagram): validity branches into construct, internal, external, and criterion-oriented validity, with face, convergent, predictive, discriminant, and concurrent validity as subtypes.
49
Many kinds of validity (same diagram as the previous slide)
50
Construct Validity
  • Usually requires multiple studies, a large body
    of evidence that supports the claim that the
    measure really tests the construct

51
Face Validity
  • At the surface level, does it look as if the
    measure is testing the construct?

("This guy seems smart to me, and he got a high score on my IQ measure.")
52
External Validity
  • Are experiments "real life" behavioral
    situations, or does the process of control put
    too much limitation on the way things really
    work?

53
External Validity
  • Variable representativeness
  • Relevant variables for the behavior studied along
    which the sample may vary
  • Subject representativeness
  • Characteristics of sample and target population
    along these relevant variables
  • Setting representativeness
  • Ecological validity - are the properties of the
    research setting similar to those outside the lab

54
Internal Validity
  • The precision of the results
  • Did the change in the DV result from changes in
    the IV, or did it come from something else?

55
Threats to internal validity
  • History: an event happens during the experiment
  • Maturation: participants get older (and change in
    other ways)
  • Selection: nonrandom selection may lead to
    biases
  • Mortality: participants drop out or can't
    continue
  • Testing: being in the study actually influences
    how the participants respond

56
Variables
  • Independent variables (explanatory)
  • Dependent variables (response)
  • Extraneous variables
  • Control variables
  • Random variables
  • Confound variables

57
Control your extraneous variable(s)
  • Can you keep them constant?
  • Should you make them random variables?

58
Extraneous Variables
  • Control variables
  • Holding things constant - Controls for excessive
    random variability
  • 90 seconds for recall
  • 15 seconds of counting backwards by 3s

59
Extraneous Variables
  • Random variables: allowed to vary freely, to spread
    variability equally across all experimental
    conditions
  • Randomization
  • Procedures that assure that each level of an
    extraneous variable has an equal chance of
    occurring in all conditions of observation
  • On average, the extraneous variable is not
    confounded with our manipulated variable
  • Examples (see the sketch below): random order of word
    presentation; time of day administered; what
    participants ate that day; when they woke up
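A minimal sketch of two of these randomization ideas: shuffling the word order for each participant and randomly assigning participants to conditions. The words and condition names are hypothetical, and a real study would usually also balance group sizes.

```python
import random

words = ["table", "cloud", "justice", "apple", "freedom", "anchor"]
conditions = ["shallow", "deep"]

def run_participant(participant_id: int) -> dict:
    order = words[:]                       # fresh copy for each participant
    random.shuffle(order)                  # random order of word presentation
    condition = random.choice(conditions)  # random assignment to a condition
    return {"id": participant_id, "condition": condition, "word_order": order}

for pid in range(3):
    print(run_participant(pid))
```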

60
Confound Variables
  • Confound variables
  • Other variables that haven't been accounted for
    (manipulated, measured, randomized, controlled)
    and that can impact changes in the dependent
    variable(s)

61
Confound Variables
  • Confound variables
  • Other variables that haven't been accounted for
    (manipulated, measured, randomized, controlled)
    and that can impact changes in the dependent
    variable(s)

62
Debugging your study
  • Pilot studies
  • A trial run-through
  • Don't plan to publish these results; just try out
    the methods
  • Manipulation checks
  • An attempt to directly measure whether the IV
    manipulation really affects the DV (see the sketch below)
  • Look for correlations with other measures of the
    desired effects
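A minimal sketch of a manipulation check, assuming a direct rating of the manipulated construct was collected in each condition; the ratings are hypothetical, and a real check would use an appropriate statistical test rather than just comparing raw means.

```python
# Compare a direct rating of the manipulated construct across conditions.
ratings = {
    "shallow": [2, 3, 2, 1, 3],   # hypothetical self-reported depth of processing (1-5)
    "deep":    [4, 5, 4, 4, 5],
}

for condition, values in ratings.items():
    print(condition, sum(values) / len(values))
# If the condition means barely differ, the manipulation may not have worked as intended.
```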