1
Measurement
  • The hardest part of doing research?
  • You'll see when we begin operationalizing concepts
  • May seem easy/trivial/even boring, but it is crucial
  • Most important part of research?
  • Fancy statistics on poor measurements are a problem.

2
The Measurement Process: Operationalization
  • Concept
  • ↓
  • Conceptual Definition
  • ↓
  • Operational Definition
  • ↓
  • Variable

3
Concepts are vague
  • Empirical political research analyzes concepts and the relationships between them. But what is
  • Education?
  • Feminism?
  • Globalization?
  • Liberalism?
  • Democracy?

4
  • Even easier concepts may be hard to define
  • Partisanship of voters
  • Number of political parties in a country
  • Political tolerance

5
Conceptual Definition: properties and subjects
  • Must communicate three things
  • The variation within a characteristic
  • The subject or groups to which the concept
    applies
  • How the characteristic is to be measured
  • E.g. The concept of ______ is defined as the
    extent to which _____ exhibits the characteristic
    of ______.
  • Try tolerance, democracy, capitalism,
    liberalism, etc.

6
Operational Definition: How does one measure the concept?
  • Critical/necessary step for analysis to be
    possible
  • Toughest part
  • One needs to be very specific
  • Easiest to criticize
  • Almost always problems/exceptions
  • Need to defend measures thoroughly

7
Operationalization: A simple example
  • Education (how well individuals are educated)
  • How might we measure it?
  • Problems with possible definitions?
  • What operationalization is actually used?

8
  • Advantages?
  • Simple to use
  • Seems right in most instances
  • Almost impossible to think of a better measure
  • Disadvantages
  • Some examples are problematic

9
Operationalization: A more difficult example
  • People's political partisanship
  • Conceptual definition: how people feel about the Democratic v. the Republican party (or loyalty to the parties, or party attachments)
  • How might we measure it?
  • Problems with possible definitions?
  • What operationalization is actually used?

10
  • Advantages?
  • Applies to voters and nonvoters alike
  • Avoids problems of which elections to use, etc.
  • Notion of deviating from ID is useful
  • As often asked, provides strength of ID as well as direction (sketched below)
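
For concreteness, a minimal sketch of how a direction response and a strength/lean follow-up might be combined into the familiar 7-point party ID scale. The function name, response labels, and coding below are illustrative assumptions, not the actual survey codebook:

    # Illustrative only: combine direction and strength/lean responses into a
    # 7-point party ID scale (1 = Strong Democrat ... 7 = Strong Republican).
    def party_id_7pt(direction, followup):
        scale = {
            ("Democrat", "Strong"): 1,                    # Strong Democrat
            ("Democrat", "Not very strong"): 2,           # Weak Democrat
            ("Independent", "Closer to Democrats"): 3,    # Leaning Democrat (a "leaner")
            ("Independent", "Neither"): 4,                # Pure independent
            ("Independent", "Closer to Republicans"): 5,  # Leaning Republican (a "leaner")
            ("Republican", "Not very strong"): 6,         # Weak Republican
            ("Republican", "Strong"): 7,                  # Strong Republican
        }
        return scale[(direction, followup)]

    print(party_id_7pt("Independent", "Closer to Democrats"))  # -> 3

Coding both direction and strength is what lets analysts talk about leaners and about changes in the share of independents.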

11
  • Disadvantages?
  • The leaner problem (text, p. 17)
  • It doesn't travel well.
  • A point about its use
  • You see it a lot in the media
  • E.g., did Bush win over Dems?
  • How do men and women differ on ID?
  • Has the share of independents increased?

12
Operationalization: A deceptively hard example
  • Number of political parties in a country
  • Appears easy; any problems with it?
  • What operationalization is actually used?

13
  • Advantages?
  • The way it deals with small parties
  • Disadvantages
  • Some examples are problematic
  • A point about its use
  • How good it is may depend on what it is used for
  • A conceptual question again

14
Reliability and validity
  • How well does an operationalization work?
  • Begin (see text, p. 14) by defining:
  • Measurement = intended characteristic + systematic error + random error (written out below)
  • Usually judged by assessing
  • Validity
  • Reliability
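
The definition above can be written as an equation. A minimal LaTeX sketch, with symbols chosen here rather than taken from the text:

    % Sketch in classical measurement-theory notation (symbols are ours, not the text's):
    % observed score = intended characteristic + systematic error + random error
    X_i = \tau_i + b + \varepsilon_i , \qquad E[\varepsilon_i] = 0
    % Validity concerns the systematic term b (bias);
    % reliability concerns the variance of the random term \varepsilon_i (noise).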

15
Validity
  • Definition is easy
  • Does a measure gauge (or, measure) the intended characteristic, and only that characteristic?
  • But it is difficult to apply
  • How do we know what is being measured?
  • Refers to problems of systematic error
  • But saying that doesn't help a whole lot

16
Validity tests
  • Face validity: does the measure look like it measures what it's supposed to?
  • Occasionally useful, at least if a measure does not pass this test.
  • Usually no explicit tests are made to determine face validity, but the term is used loosely (Shull and Vanderleeuw)

17
  • Construct validity: concerned with the relationship of a given measure with other measures, e.g., is the SAT a good predictor of success in college? (sketched below)
  • Useful to a degree
  • But how strong a relationship is required?
  • Other, related tests (content, criterion-related
    validity) are similar
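
Construct validity claims of this kind are usually checked with a correlation. A minimal sketch with invented numbers (the sat and gpa lists are hypothetical, not real data):

    # Does the SAT "predict" college success (here, GPA)? Data invented for illustration.
    from statistics import correlation  # Python 3.10+

    sat = [1050, 1180, 1250, 1320, 1400, 1480]
    gpa = [2.6, 2.9, 3.1, 3.0, 3.5, 3.7]

    r = correlation(sat, gpa)  # Pearson's r
    print(f"r = {r:.2f}")      # a strong positive r supports construct validity,
                               # but how strong is "strong enough" remains a judgment call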

18
An aside on Hawthorne effects
  • Effects that are a result of individuals' awareness that they are being tested
  • Origin in an industrial study
  • Very important in experiments
  • Disguising the purpose of an experiment helps
  • Analogous impact in political science is in surveys
  • E.g., a survey on elections makes people more attentive to them, more likely to vote

19
Reliability
  • A measure is reliable to the extent that it is consistent, i.e., there is no random error
  • Scales or guns are good examples
  • Note: Reliability ≠ Validity
  • Random error (noise) is never entirely absent
  • Unlike with validity, there are tests of reliability

20
Evaluating Reliability
  • Four methods (two mentioned in text)
  • Test-retest method. Problem: learning effect
  • Alternative forms. Problem: are the forms equivalent?
  • Split-half method. Problem: multiple possible halves
  • Internal consistency. A generalization of split-half. Best and most often used

21
  • Reliability methods
  • All rely on correlations (later in course)
  • Best: the internal consistency method averages all split-half correlations
  • This method is called alpha (see the sketch below). Simple formula you can learn if you need to
  • (Varies between 0 and 1.)
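
A minimal sketch of alpha in code, using the standard Cronbach's alpha formula and a few invented 5-point survey items (the data are hypothetical):

    # Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    from statistics import pvariance

    items = [                  # invented responses: 3 items, 6 respondents
        [4, 5, 3, 4, 2, 5],
        [4, 4, 3, 5, 2, 4],
        [5, 5, 2, 4, 3, 5],
    ]

    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]        # each respondent's total score
    sum_item_var = sum(pvariance(item) for item in items)
    alpha = (k / (k - 1)) * (1 - sum_item_var / pvariance(totals))
    print(f"alpha = {alpha:.2f}")  # closer to 1 = more internally consistent items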

22
  • Validity/reliability concepts apply not just to tests or survey items. Think about:
  • Profit as measure of CEO ability
  • Gun registrations as measure of gun ownership
  • Reported crimes as a measure of the crime rate
  • Even hard data can be invalid/unreliable

23
A real-world example
  • Interesting, important concept: support for democracy
  • Conceptual definition: how much people in various countries say they support (or prefer, or would like) a democratic government.
  • Operationalization (survey): Agree or disagree: "Democracy has its problems, but it's better than any other form of government."

24
  • Surveys have often found high levels of support
    for democracy using this kind of measure
  • Question: is this a valid measure of support for democracy?

25
Variables
  • Actual measurement of the concept
  • Variable name v. variable's values
  • As long as you remember this distinction, you shouldn't have a problem
  • Examples
  • Religion (Protestant, Catholic, Jewish,
    etc.)
  • Height (values in feet and inches)

26
Variables (cont.)
  • Residual categories--a small, but often nagging
    point
  • Cases (respondents, counties, countries, etc.) for which the data are missing
  • We'll deal with these later; just note the problem here

27
Levels of Measurement
  • Nominal (least precise): categorical
  • E.g., Protestant, Catholic, Jewish, Atheist
  • Ordinal: relative difference (higher/lower, for/against)
  • E.g., support, neutral, oppose
  • Interval (most precise): exact difference in units
  • Common in aggregate data: turnout, budget, GDP, number of members, deaths in war
  • Less common in individual-level data:
  • non-quantifiable (religion, region, etc.)
  • no agreed-upon scale (happiness, tolerance)
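
A minimal sketch of how the three levels tend to show up in a dataset; the variable names and values are invented for illustration:

    respondent = {
        "religion": "Catholic",     # nominal: unordered categories
        "policy_view": "neutral",   # ordinal: ordered, but distances between categories unknown
        "income": 42000,            # interval: exact, equal units
    }

    # Ordinal values can be ranked but not added or averaged without extra assumptions:
    order = ["oppose", "neutral", "support"]
    rank = order.index(respondent["policy_view"])   # -> 1

    # Interval-level values support arithmetic directly:
    doubled = respondent["income"] * 2
    print(rank, doubled)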

28
Levels (cont.)
  • In practice, the distinction is not always
    observed.
  • We'll see that later on.
  • Note that level of measurement and reliability are not the same thing
  • Interval-level data can be unreliable and invalid
    (crime rates?)

29
Unit of analysis
  • The entity we are describing
  • Individual: we mean individual people
  • Aggregate: any grouping of individuals
  • Often, a single concept can be studied at
    multiple levels
  • Example: professionalization of state legislators

30
Unit of analysis (cont.)
  • May want to measure and explain why some
    individual legislators show more signs of
    professionalization
  • May want to measure and explain why legislatures
    in some states are more professionalized

31
Unit of analysis
  • Unit of analysis: individual or aggregate?
  • Ecological fallacy: inference about individuals based on aggregate data
  • E.g., concluding from the aggregate data here that religious individuals are tolerant


Individual    Religious?     Tolerant?
A             Yes            Yes
B             Yes            No
C             No             Yes
D             No             No
Aggregate     2 Yes, 2 No    2 Yes, 2 No
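
A quick computation over exactly the four individuals in the table makes the fallacy concrete (a sketch; the dictionaries simply restate the table):

    # Ecological fallacy: the aggregate totals (2 religious, 2 tolerant) cannot
    # support the claim that religious individuals are (more) tolerant.
    people = [
        {"id": "A", "religious": True,  "tolerant": True},
        {"id": "B", "religious": True,  "tolerant": False},
        {"id": "C", "religious": False, "tolerant": True},
        {"id": "D", "religious": False, "tolerant": False},
    ]

    print(sum(p["religious"] for p in people), "religious,",
          sum(p["tolerant"] for p in people), "tolerant")   # aggregate view: 2 and 2

    religious = [p for p in people if p["religious"]]
    secular = [p for p in people if not p["religious"]]
    print(sum(p["tolerant"] for p in religious), "of", len(religious), "religious are tolerant")      # 1 of 2
    print(sum(p["tolerant"] for p in secular), "of", len(secular), "non-religious are tolerant")      # 1 of 2
    # Tolerance is 50% in both groups, so the individual-level inference does not follow.
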
32
Identify the unit of analysis and level of
measurement
  • Gender (Individual 1: F; Individual 2: M)
  • Budget (County 1: 3.2 million; County 2: 58.1 million)
  • Tolerance (Individual 1: highly intolerant; Individual 2: neutral)
  • Support for Gay Marriage (Sweden: 67%; Spain: 29%)
  • Electoral system (Country 1: PR; Country 2: Plurality)