Contextual Vocabulary Acquisition: From Algorithm to Curriculum - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Contextual Vocabulary Acquisition: From Algorithm to Curriculum


1
Contextual Vocabulary Acquisition: From Algorithm
to Curriculum
  • Michael W. Kibby, Ph.D.
  • Department of Learning & Instruction and The
    Reading Center
  • William J. Rapaport, Ph.D.
  • Department of Computer Science & Engineering,
  • Department of Philosophy, and Center for
    Cognitive Science
  • Karen M. Wieland
  • Department of Learning & Instruction, The
    Reading Center,
  • and The Nichols School
  • NSF ROLE Grant REC-0106338

2
(No Transcript)
3
(No Transcript)
4
Why Learning Word Meanings Is Important
5
Why Learning Word Meanings Is Important: Reason 1
National Assessment of Educational Progress-Reading
(NAEP-Reading)
6
Meaning Vocabulary Assessment on NAEP-R
  • Meaning vocabulary is the application of one's
    understanding of word meanings to passage
    comprehension.

7
  • Vocabulary knowledge is considered to be one of
    the five essential components of reading as
    defined by the No Child Left Behind legislation.

8
  • NAEP will not test definitions in isolation from
    surrounding text; i.e., students will not be
    assessed on their prior knowledge of definitions
    of words on a list.

9
(No Transcript)
10
Examples
  • Altruistic
  • Magnanimously
  • Dispersed
  • Impetus
  • Forage
  • Soothing
  • Lost in thought
  • Huddled
  • Abide
  • Piqued
  • Beholden
  • Marathon journey
  • Legacy
  • Abated
  • Social contract
  • Grudge

11
Three Reasons NAEP-R Does Not Test a Specific
Word List
  1. Knowledge of the explicit definition of a word is
    not what is required for reading comprehension.

12
  2. The meaning of a word is often dependent upon
    the context.
  • e.g., "cast":
  • The fisherman cast his line.
  • The members of the cast took a bow.
  • They put a cast on my broken arm.
  • The yard is littered with shells cast off by the
    cicadas.

13
  3. Writers often use words in a manner that goes
    beyond their concrete, familiar definition, but
    do so in ways that skilled readers can interpret
    effectively.

14
Why Learning Word Meanings Is Important: Reason 2
  • Learning new things and their words changes or
    increases our perception and organization of the
    world.

15
The Lego Notion of Learning New Things
16
Why Learning Word Meanings Is Important: Reason 3
  • Reading comprehension mandates knowing the
    meaning (i.e., concept, thing) associated with
    words in the text
  • When students do not know meanings of words in a
    written text, comprehension often decreases.

17
(No Transcript)
18
Why Learning Word Meanings Is Important: Reason 4
  • Learning new things and words facilitates
    students' abilities to use words judiciously,
    which is much valued in our society.

19
(No Transcript)
20
(No Transcript)
21
Why Learning Word Meanings Is Important: Reason 5
  • The Profound Effects of Limited Vocabulary

22
Profound effects of limited vocabulary (continued)
  • Limited vocabulary is associated with lower IQ
    scores.
  • Limited vocabulary is associated with limited
    reading comprehension.
  • In grades 7 and up, vocabulary and reading
    comprehension correlate .75 to .85.

23
Social Class and Meaning Vocabulary
  • Hart, Betty, & Risley, Todd R. (1995). Meaningful
    differences in the everyday experience of young
    children. Baltimore, MD: Brookes.

24
  • Studied 42 children's vocabulary growth from
    their 9th month to their 36th month.
  • Researchers:
  • Visited each child's home once a month.
  • Observed and tape-recorded for one hour every
    word spoken to or by the child.
  • Recorded 23-30 hours for every child.

25
Actual and Estimated Number of Words Heard from 0
- 48 Months
26
The Invisible Curriculum
27
Cumulative Number of New Words (Hart & Risley,
1995)
28
A Brief Background on the Counting of Words
  • Carroll, Davies, & Richman (1971), The American
    Heritage Word Frequency Book, called the WFI.
  • A count of 5,088,721 words (tokens) in printed
    English for grades 3-9.

29
Of 5,088,721 Words in WFI
  • There were 86,741 different words.
  • But the following 13 were counted as different
    words:
  • add, additive, additives,
  • adds, addition, additions,
  • added, addend, addends,
  • adding, additional, ADDITION,
  • as well as Add (capitalized).

30
When Do Two Words Differ?
  • Nagy & Anderson sampled WFI words.
  • Put each word in 1 of 6 classes varying in
    semantic relation to other words.
  • Classes 0, 1, 2: closely related semantically.
  • Classes 3, 4, 5: progressively more distant.
  • Estimated there are 139,020 different words in
    semantic categories 0, 1, 2.

31
  • But 45,453 of these are base words; knowing these
    45,453 means a reader knows all 139,020.
  • Adding 43,080 in classes 3, 4, & 5 brings the
    total to 88,533 different word families in printed
    school texts for grades 3-9.

32
(No Transcript)
33
Learning New Words is Natural
34
(No Transcript)
35
Edna Heidbreder, The Attainment of Concepts (1946)
  • taught persons to associate nine pairs of visual
    shapes and pronounceable pseudo-words
  • told persons this was a memory task

36
pran
37
mulf
38
. . . . . . . . . . . . . . .
relk
39
Test
40-45
(Table, built up one column per slide across slides
40-45; the rows were the nine shape-pseudo-word
pairs, pran, mulf, . . . , relk, and the bottom row
gave the number of trials needed to learn each set:)

Set:              I    II   III   IV    V
Trials to learn:  27   32   11    4    1.5
46
Definition of CVA
  • Contextual Vocabulary Acquisition =def
  • the acquisition of word meanings from text
  • incidental
  • deliberate
  • by reasoning about
  • contextual clues
  • background knowledge (linguistic, factual,
    commonsense)
  • Including hypotheses from prior encounters (if
    any) with the word
  • without external sources of help
  • No dictionaries
  • No people

47
CVA: From Algorithm to Curriculum
  • Computational theory of CVA
  • Based on:
  • algorithms developed by Karen Ehrlich (1995)
  • verbal protocols (case studies)
  • Implemented in a semantic-network-based
    knowledge-representation & reasoning system:
  • SNePS (Stuart C. Shapiro & colleagues)
  • Educational curriculum to teach CVA
  • Based on our algorithms & protocols
  • To improve vocabulary & reading comprehension
  • Joint work with Michael Kibby & Karen Wieland
  • Center for Literacy & Reading Instruction

48
Project Goals
  • Develop & implement computational theory of CVA
    based on verbal protocols (case studies)
  • Translate algorithms into a curriculum
  • To improve CVA and reading comprehension in
    science, technology, engineering, math (STEM)
  • Use new case studies, based on the curriculum,
    to improve both algorithms & curriculum

49
People Do Incidental CVA
  • We know more words than explicitly taught
  • Average high-school grad knows 45K words
  • ⇒ learned 2.5K words/year (over 18 yrs.)
  • But only taught 400/school-year
  • = 4800 in 12 years of school (≈ 10% of total;
    see the arithmetic check below)
  • Most word meanings learned from context
  • including oral & perceptual contexts
  • incidentally (unconsciously)
  • How?

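A back-of-the-envelope check of the estimate above (all figures are the slide's own):

```python
# Back-of-the-envelope check of the incidental-CVA estimate
# (all figures are from the slide above).
words_known = 45_000          # average high-school graduate's vocabulary
years = 18                    # years of life at graduation
print(words_known / years)    # 2500.0 words learned per year

taught = 400 * 12             # words explicitly taught over 12 school years
print(taught)                 # 4800
print(round(100 * taught / words_known))  # ~11, i.e., roughly 10% of the total
```
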
50
People Also Do Deliberate CVA
  • You're reading
  • You understand everything you read, until
  • You come across a new word
  • Not in dictionary
  • No one to ask
  • So, you try to figure out its meaning from
    context
  • How?
  • Guess? derive? infer? deduce? educe? construct?
    predict?
  • Our answer:
  • Compute it from inferential search of context,
    including background knowledge

51
What does brachet mean?

52
(From Malory's Morte D'Arthur; page numbers in
brackets)
  • 1. There came a white hart running into the
    hall with a white brachet next to him, and thirty
    couples of black hounds came running after them.
    [66]
  • As the hart went by the sideboard, the white
    brachet bit him. [66]
  • The knight arose, took up the brachet and rode
    away with the brachet. [66]
  • A lady came in and cried aloud to King Arthur,
    "Sire, the brachet is mine." [66]
  • There was the white brachet which bayed at him
    fast. [72]
  • 18. The hart lay dead; a brachet was biting on
    his throat, and other hounds came behind. [86]

53
Computational cognitive theory of how to learn
word meanings
  • From context
  • I.e., text + grammatical info + reader's prior
    knowledge
  • With no external sources (human, on-line)
  • Unavailable, incomplete, or misleading
  • Domain-independent
  • But more prior domain-knowledge yields better
    definitions
  • definition = hypothesis about word's meaning
  • Revisable each time word is seen

54
Cassie learns what brachet means
Background info about: harts, animals, King Arthur,
etc. No info about brachets.
Input: formal-language (SNePS) version of simplified
English.

A hart runs into King Arthur's hall.
  In the story, B12 is a hart.
  In the story, B13 is a hall.
  In the story, B13 is King Arthur's.
  In the story, B12 runs into B13.
A white brachet is next to the hart.
  In the story, B14 is a brachet.
  In the story, B14 has the property white.
Therefore, brachets are physical objects.
  (deduced while reading; Cassie believes that
  only physical objects have color)
55
--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: phys obj,
  Possible Properties: white,
  Possibly Similar Items: animal, mammal, deer,
    horse, pony, dog
I.e., a brachet is a physical object that can be
white and that might be like an animal, mammal,
deer, horse, pony, or dog.
56
A hart runs into King Arthur's hall.
A white brachet is next to the hart.
The brachet bites the hart's buttock.
--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: animal,
  Possible Actions: bite buttock,
  Possible Properties: white,
  Possibly Similar Items: mammal, pony
57
A hart runs into King Arthur's hall.
A white brachet is next to the hart.
The brachet bites the hart's buttock.
The knight picks up the brachet.
The knight carries the brachet.
--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: animal,
  Possible Actions: bite buttock,
  Possible Properties: small, white,
  Possibly Similar Items: mammal, pony
58
A hart runs into King Arthur's hall.
A white brachet is next to the hart.
The brachet bites the hart's buttock.
The knight picks up the brachet.
The knight carries the brachet.
The lady says that she wants the brachet.
--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: animal,
  Possible Actions: bite buttock,
  Possible Properties: valuable, small, white,
  Possibly Similar Items: mammal, pony
59
  • A hart runs into King Arthur's hall. A white
    brachet is next to the hart. The brachet bites
    the hart's buttock. The knight picks up the
    brachet. The knight carries the brachet. The lady
    says that she wants the brachet.
  • The brachet bays at Sir Tor. [background
    knowledge: only hunting dogs bay]
  • --> (defineNoun "brachet")
  • Definition of brachet:
  • Class Inclusions: hound, dog,
  • Possible Actions: bite buttock, bay, hunt,
  • Possible Properties: valuable, small, white
  • I.e., a brachet is a hound (a kind of dog) that
    can bite, bay, and hunt,
  • and that may be valuable, small, and white.
  • (A frame-update sketch in code follows.)

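The defineNoun outputs above suggest a definition frame whose slots are filled in as each passage is read. Here is a minimal Python sketch of that idea; the names (NounDefinition, update) are hypothetical, and the real algorithm (implemented in SNePS) also generalizes and prunes slots rather than only accumulating them:

```python
from dataclasses import dataclass, field

@dataclass
class NounDefinition:
    """Hypothesis about an unknown noun, revised on each encounter."""
    class_inclusions: set = field(default_factory=set)
    possible_actions: set = field(default_factory=set)
    possible_properties: set = field(default_factory=set)
    similar_items: set = field(default_factory=set)

    def update(self, classes=(), actions=(), properties=(), similar=()):
        self.class_inclusions |= set(classes)
        self.possible_actions |= set(actions)
        self.possible_properties |= set(properties)
        self.similar_items |= set(similar)

brachet = NounDefinition()
# "A white brachet..." + rule: only physical objects have color
brachet.update(classes={"phys obj"}, properties={"white"},
               similar={"animal", "mammal", "deer", "dog"})
# "The brachet bites..." -> more specific class info
brachet.update(classes={"animal"}, actions={"bite buttock"})
# "The brachet bays..." + rule: only hunting dogs bay
brachet.update(classes={"hound", "dog"}, actions={"bay", "hunt"})
print(brachet)
```
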
60
General Comments
  • System's behavior ≈ human protocols
  • System's definition ≈ OED's definition:
  • A brachet is "a kind of hound which hunts by
    scent"

61
Computational cognitive theory of how to learn
word meanings from context (cont.)
  • 3 kinds of vocabulary acquisition:
  • Construct new definition of unknown word
  • What does "brachet" mean?
  • Fully revise definition of misunderstood word
  • Does "smiting" entail killing?
  • Expand definition of word used in new sense
  • Can you "dress" (i.e., clothe) a spear?
  • Initial hypothesis
  • Revision(s) upon further encounter(s)
  • Converges to stable, dictionary-like
    definition
  • Subject to revision

62
Motivations & Applications
  • Part of cognitive-science projects:
  • Narrative text understanding
  • Syntactic semantics (contra Searle's Chinese-Room
    Argument)
  • Computational applications:
  • Information extraction
  • Autonomous intelligent agents
  • There can be no complete lexicon
  • Agent/info-extraction system shouldn't have to
    stop to ask questions
  • Other applications:
  • L1 & L2 acquisition research
  • Computational lexicography
  • Education: improve reading comprehension

63
State of the Art
  • Vocabulary Learning
  • Some dubious contributions:
  • Useless algorithms
  • Contexts that include definition
  • Useful contribution:
  • (good) reader's word-model
  • updateable frame with slots & defaults
  • Psychology
  • Cues to look for (= slots for frame):
  • Space, time, value, properties, functions,
    causes, classes, synonyms, antonyms
  • Can understand a word w/o having a definition
  • Computational Linguistics
  • Systems need scripts, human informants,
    ontologies
  • Not needed in our system
  • CVA ≠ Word-Sense Disambiguation
  • Essay question vs. multiple-choice test

64
State of the Art: Computational Linguistics
  • Information extraction systems
  • Autonomous intelligent agents
  • There can be no complete lexicon
  • Such systems/agents shouldn't have to stop to ask
    questions

65
State of the Art: Computational Linguistics
  • Granger 1977: Foul-Up
  • Based on Schank's theory of scripts (schema
    theory)
  • Our system not restricted to scripts
  • Zernik 1987: self-extending phrasal lexicon
  • Uses human informant
  • Our system is really self-extending
  • Hastings 1994: Camille
  • Maps unknown word to known concept in ontology
  • Our system can learn new concepts
  • Word-Sense Disambiguation:
  • Given ambiguous word + list of all meanings,
    determine the correct meaning
  • (≈ multiple-choice test)
  • Our system: given new word, compute its meaning
  • (≈ essay question)

66
State of the Art Vocabulary Learning (I)
  • Elshout-Mohr/van Daalen-Kapteijns 1981,1987
  • Application of Winstons AI arch learning
    theory
  • (Good) readers model of new word frame
  • Attribute slots, default values
  • Revision by updating slots values
  • Poor readers update by replacing entire frames
  • But EM vDK used
  • Made-up words
  • Carefully constructed contexts
  • Presented in a specific order

67
Elshout-Mohr & van Daalen-Kapteijns
  • Experiments with neologisms in 5 artificial
    contexts
  • When you are used to a view it is depressing when
    you live in a room with kolpers.
  • Superordinate information
  • At home he had to work by artificial light
    because of those kolpers.
  • During a heat wave, people want kolpers, so
    sun-blind sales increase.
  • Contexts showing 2 differences from the
    superordinate
  • I was afraid the room might have kolpers, but
    plenty of sunlight came into it.
  • This house has kolpers all summer until the
    leaves fall out.
  • Contexts showing 2 counterexamples due to the 2
    differences

68
State of the Art: Psychology
  • Johnson-Laird 1987
  • Word understanding ≠ definition
  • Definitions aren't stored
  • "During the Renaissance, Bernini cast a bronze of
    a mastiff eating truffles."

69
State of the Art: Psychology
  • Sternberg et al. 1983,1987
  • Cues to look for (= slots for frame):
  • Spatiotemporal cues
  • Value cues
  • Properties
  • Functions
  • Cause/enablement information
  • Class memberships
  • Synonyms/antonyms
  • To acquire new words from context
  • Distinguish relevant/irrelevant information
  • Selectively combine relevant information
  • Compare this information with previous beliefs

70
Sternberg
  • The couple there on the blind date was not
    enjoying the festivities in the least. An
    acapnotic, he disliked her smoking and when he
    removed his hat, she, who preferred ageless
    men, eyed his increasing phalacrosis and
    grimaced.

71
State of the Art: Vocabulary Learning (II)
  • Some dubious contributions:
  • Mueser 1984: Practicing Vocabulary in Context
  • BUT: context = definition!!
  • Clarke & Nation 1980: a strategy (algorithm?):
  • Look at word & context; determine POS
  • Look at grammatical context
  • E.g., who does what to whom?
  • Look at wider context
  • E.g., search for Sternberg-like clues
  • Guess the word & check your guess

72
CVA: From Algorithm to Curriculum
  • "guess the word" =
  • "then a miracle occurs"
  • Surely,
  • we computer scientists
  • can be more explicit!

73
CVA: From algorithm to curriculum
  • Treat "guess" as a procedure call (subroutine)
  • Fill in the details with our algorithm
  • Convert the algorithm into a curriculum
  • To enhance students' abilities to use deliberate
    CVA strategies
  • To improve reading comprehension
  • ... and back again!
  • Use knowledge gained from CVA case studies to
    improve the algorithm
  • I.e., use Cassie to learn how to teach humans;
  • use humans to learn how to teach Cassie

74
  • Why not use a dictionary?
  • Because:
  • People are lazy (!)
  • Dictionaries are not always available
  • Dictionaries are always incomplete
  • Dictionary definitions are not always useful:
  • chaste =df clean, spotless ⇒? "new dishes are
    chaste"
  • college =df a body of clergy living together
    and supported by a foundation
  • Most words are learned via incidental CVA,
    not via dictionaries
  • Most importantly:
  • Dictionary definitions are just more contexts!

75
Why not use a dictionary?
  • Merriam-Webster New Collegiate Dictionary
  • chaste:
  • innocent of unlawful sexual intercourse
  • [student: stay away from that one!]
  • celibate
  • [student: huh?]
  • pure in thought and act; modest
  • [student: I have to find a sentence for that?]
  • a: severely simple in design or execution;
    austere
  • [student: huh? severely? austere?]
  • b: clean, spotless
  • [student: all right! "The plates were still
    chaste after much use."]
  • Deese 1967 / Miller 1985

76
Why not use a dictionary?
  • Merriam-Webster (continued)
  • college:
  • a body of clergy living together and supported by
    a foundation
  • a building used for an educational or religious
    purpose
  • a: a self-governing constituent body of a
    university offering living quarters and
    instruction but not granting degrees
  • b: a preparatory or high school
  • c: an independent institution of higher
    learning offering a course of general
    studies leading to a bachelor's degree
  • Problem: ordering is historical!

77
Why not use a dictionary?
  • Merriam-Webster (continued)
  • infract: infringe
  • infringe: encroach
  • encroach:
  • to enter by gradual steps or by stealth into the
    possessions or rights of another
  • to advance beyond the usual or proper limits
    trespass

78
Why not use a dictionary?
  • Collins COBUILD Dictionary
  • Helping Learners with Real English
  • chaste.
  • Someone who is chaste does not have sex with
    anyone, or only has sex with their husband or
    wife an old-fashioned use, used showing
    approval. EG She was a holy woman, innocent and
    chaste.
  • Something that is chaste is very simple in style,
    without much decoration. EG chaste houses built
    in 1732

79
Why not use a dictionary?
  • Collins COBUILD Dictionary
  • college.
  • A college is 1.1 an institution where students
    study for qualifications or do training courses
    after they have left school.
  • infract: not in dictionary
  • infringe.
  • If you infringe a law or an agreement, you break
    it.
  • encroach.
  • To encroach on or upon something means to slowly
    take possession or control of it, so that someone
    else loses it bit by bit.

80
Question (objection)
  • Teaching computers ≠ teaching humans!
  • But:
  • Our goal:
  • Not to teach people to think like computers
  • But to explicate computable & teachable methods
    to hypothesize word meanings from context
  • AI as computational psychology:
  • Devise computer programs that are essentially
    faithful simulations of human cognitive behavior
  • Can tell us something about the human mind.
  • We are teaching a machine, to see if what we
    learn in teaching it can help us teach students
    better.

81
How Does Our System Work?
  • Uses a semantic-network computer system
  • semantic networks ≈ concept maps
  • serves as a model of the reader
  • represents:
  • reader's prior knowledge
  • the text being read
  • can reason about the text and the reader's
    knowledge

82
Fragment of reader's prior knowledge:
  m3  = In "real life", white is a color
  m6  = In "real life", harts are deer
  m8  = In "real life", deer are mammals
  m11 = In "real life", halls are buildings
  m12 = In "real life", b1 is named "King Arthur"
  m14 = In "real life", b1 is a king (etc.)
83
m16 = if v3 has property v2, & v2 is a color, & v3
is a member of class v1, then v1 is a kind of
physical object
84
Reading the story:
  m17 = In the story, b2 is a hart
  m24 = In the story, the hart runs into b3
  (that b3 is King Arthur's hall: not shown;
  that harts are deer: not shown)
85
The entire network, showing the reader's mental
context consisting of prior knowledge, the story, &
inferences. The definition algorithm searches
this network & abstracts parts of it to produce a
(preliminary) definition of brachet.
86
Implementation
  • SNePS (Stuart C. Shapiro & SNeRG):
  • Intensional, propositional semantic-network
    knowledge-representation & reasoning system
  • Node-based & path-based reasoning
  • I.e., logical inference & generalized inheritance
  • SNeBR: belief-revision system
  • Used for revision of definitions
  • SNaLPS: natural-language input/output
  • Cassie: computational cognitive agent

87
How It Works
  • SNePS represents:
  • background knowledge + text information
  • in a single, consolidated semantic network
  • Algorithms deductively search network for
    slot-fillers for definition frame
  • Search is guided by desired slots
  • E.g., prefers general info over particular info,
    but takes what it can get

88
Noun Algorithm
  • Find or infer:
  • Basic-level class memberships (e.g., "dog",
    rather than "animal")
  • else most-specific-level class memberships
  • else names of individuals
  • Properties of Ns (else, of individual Ns)
  • Structure of Ns (else ...)
  • Functions of Ns (else ...)
  • Acts that Ns perform (else ...)
  • Agents that perform acts w.r.t. Ns
    & the acts they perform (else ...)
  • Ownership
  • Synonyms
  • Else do syntactic/algebraic manipulation:
  • "Al broke a vase" ⇒ a vase is something Al broke
  • Or: a vase is a breakable physical object
  • (The else-chain is sketched in code below.)

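A minimal sketch of the "else" chain above: try finders in priority order and take what you can get. All names and the toy KB below are hypothetical; the actual search is deductive inference over the SNePS network:

```python
# Toy KB: candidate slot-fillers for "brachet" at different levels
# of generality (all names here are hypothetical).
KB = {
    "brachet": {
        "basic_level": set(),          # nothing basic-level known yet
        "most_specific": {"hound"},    # e.g., deduced from "only hunting dogs bay"
        "individuals": {"the lady's brachet"},
    }
}

def find_basic_level(word, kb):
    return kb[word]["basic_level"]

def find_most_specific(word, kb):
    return kb[word]["most_specific"]

def find_individuals(word, kb):
    return kb[word]["individuals"]

def first_nonempty(finders, word, kb):
    """The 'else' chain: try finders in priority order; take what you can get."""
    for find in finders:
        result = find(word, kb)
        if result:
            return result
    return set()

print(first_nonempty([find_basic_level, find_most_specific, find_individuals],
                     "brachet", KB))    # -> {'hound'}
```
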
89
Verb Algorithm
  • Find or infer
  • Predicate structure
  • Categorize arguments/cases
  • Results of V-ing
  • Effects, state changes
  • Enabling conditions for V
  • Future work
  • Classification of verb-type
  • Synonyms
  • Also preliminary work on adjective algorithm

90
Belief Revision
  • Used to revise definitions of words with
    different sense from current meaning hypothesis
  • SNeBR (ATMS; Martins & Shapiro 88):
  • If inference leads to a contradiction, then:
  • SNeBR asks user to remove culprit(s)
  • & automatically removes consequences inferred
    from culprit
  • SNePSwD (SNePS w/ Defaults; Martins & Cravo 91)
  • Currently used to automate step 1, above
  • AutoBR (Johnson & Shapiro, in progress)
  • new default reasoner (Bhushan & Shapiro, in
    progress)
  • Will replace SNePSwD

91
Revision & Expansion
  • Removal & revision being automated via SNePSwD by
    ranking all propositions with kn_cat (culprit
    selection is sketched in code below):
  • most certain:
  • intrinsic: info re: language & fundamental
    background info ("before" is transitive)
  • story: info in text ("King Lot rode to town")
  • life: background info w/o variables or
    inference (dogs are animals)
  • story-comp: info inferred from text (King
    Lot is a king, rode on a horse)
  • life-rule.1: everyday commonsense background
    info (BearsLiveYoung(x) ⇒ Mammal(x))
  • life-rule.2: specialized background info
    (x smites y ⇒ x kills y by hitting y)
  • least certain:
  • questionable: already-revised life-rule.2;
    not part of input

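A minimal sketch of how such a ranking could automate culprit selection when a contradiction is detected: retract the least-certain supporting proposition. Names and data below are hypothetical; the actual mechanism is SNePSwD over SNePS networks:

```python
# kn_cat certainty ranks, most certain first (order from the slide).
KN_CAT_RANK = {
    "intrinsic": 0, "story": 1, "life": 2, "story-comp": 3,
    "life-rule.1": 4, "life-rule.2": 5, "questionable": 6,
}

def choose_culprit(support):
    """Among the propositions supporting a contradiction, pick the
    least-certain one (highest rank) as the culprit to retract."""
    return max(support, key=lambda p: KN_CAT_RANK[p["kn_cat"]])

# The 'smite' case: the story fact survives; the specialized rule goes.
support = [
    {"prop": "King Arthur drew Excalibur", "kn_cat": "story"},
    {"prop": "smite(x,y,t) => hit(x,y,t) & dead(y,t)", "kn_cat": "life-rule.2"},
]
print(choose_culprit(support)["prop"])   # the life-rule.2 proposition
```
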
92
Belief Revision: "smite"
  • Misunderstood word: 2-stage subtractive
    revision
  • Background knowledge includes:
  • (*) smite(x,y,t) ⇒ hit(x,y,t) & dead(y,t) &
    cause(hit(x,y,t), dead(y,t))
  • P1: King Lot smote down King Arthur.
  • D1: If person x smites person y at time t, then x
    hits y at t, and y is dead at t.
  • Q1: What properties does King Arthur have?
  • R1: King Arthur is dead.
  • P2: King Arthur drew Excalibur.
  • Q2: When did King Arthur do this?
  • SNeBR is invoked:
  • KA's drawing E is inconsistent with being dead
  • (*) replaced by: smite(x,y,t) ⇒ hit(x,y,t) &
    possibly dead(y,t) & (dead(y,t) ⇒ cause(hit,
    dead))
  • D2: If person x smites person y at time t, then
    x hits y at t & possibly (y is dead at t)
  • P3: another passage in which (smiting without
    death)
  • D3: If person x smites person y at time t, then
    x hits y at t

93
Belief Revision: "dress"
  • Additive revision
  • Bkgd info includes:
  • (1) dresses(x,y) ⇒ ∃z(clothing(z) & wears(y,z))
  • (2) Spears don't wear clothing (both
    kn_cat = life-rule.1)
  • P1: King Arthur dressed himself.
  • D1: A person can dress itself; result: it wears
    clothing.
  • P2: King Claudius dressed his spear.
  • Cassie infers: King Claudius's spear wears
    clothing.
  • Q2: What wears clothing?
  • SNeBR is invoked:
  • "KC's spear wears clothing" is inconsistent
    with (2).
  • (1) replaced by: dresses(x,y) ⇒ (∃z(clothing(z) &
    wears(y,z)) ∨ NEWDEF)
  • Replace (1), not (2), because of verb in
    antecedent of (1) (Gentner)
  • P3: other passages in which dressing spears
    precedes fighting
  • D2: A person can dress a spear or a person;
  • result: person wears clothing, or person
    is enabled to fight

94
Figure out meaning of word from what?
  • context (i.e., the text)?
  • Werner & Kaplan 52, McKeown 85, Schatz & Baldwin
    86
  • context and reader's background knowledge?
  • Granger 77, Sternberg 83, Hastings 94
  • context including background knowledge?
  • Nation & Coady 88, Graesser & Bower 90
  • Note:
  • context = text ⇒ context is external to
    reader's mind
  • Could also be spoken/visual/situative (still
    external)
  • background knowledge is internal to reader's
    mind
  • What is (or should be) "the context" for CVA?

95
Some Proposed Preliminary Definitions (to extract
order out of confusion)
  • Unknown word for a reader =def
  • Word or phrase that reader has never seen before
  • Or only has vague idea of its meaning
  • (Different levels of knowing meaning of word)
  • Notation: X

96
Proposed preliminary definitions
  • Text =def
  • (written) passage
  • containing X
  • from a single phrase or sentence to several
    paragraphs

97
Proposed preliminary definitions
  • Co-text of X in some text =def
  • The entire text minus X; i.e., entire text
    surrounding X
  • E.g., if X = "brachet", and text =
  • "There came a white hart running into the hall
    with a white brachet next to him, and thirty
    couples of black hounds came running after them."
  • Then X's co-text in this text =
  • "There came a white hart running into the hall
    with a white ______ next to him, and thirty
    couples of black hounds came running after them."
  • Cf. "cloze" tests in psychology
  • But, in CVA, reader seeks meaning or definition,
  • NOT a missing word or synonym; there's no
    "correct answer"!
  • Co-text is what many mean by "context";
  • BUT they shouldn't! (A tiny co-text sketch
    follows.)

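A tiny illustration of co-text as just defined: the text with X blanked out. The helper below is hypothetical, purely for illustration:

```python
import re

def co_text(text: str, x: str) -> str:
    """Return the co-text of word x in text: the entire text minus x."""
    # \b word boundaries so x doesn't match inside longer words
    return re.sub(rf"\b{re.escape(x)}\b", "______", text)

text = ("There came a white hart running into the hall with a white brachet "
        "next to him, and thirty couples of black hounds came running after them.")
print(co_text(text, "brachet"))
```
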
98
Proposed preliminary definitions
  • The reader's prior knowledge =def
  • the knowledge that the reader has when s/he
    begins to read the text
  • and is able to recall as needed while reading
  • knight picks up & carries brachet ⇒ small
  • Warnings:
  • knowledge ≠ truth
  • so, "prior beliefs" is better
  • "prior" vs. "background" vs. "world", etc.
  • See next slide!

99
Proposed preliminary definitions
  • Possible synonyms for "prior knowledge",
    each with different connotation:
  • Background knowledge:
  • Can use for information that author assumes
    reader to have
  • World knowledge:
  • General factual knowledge about things other than
    the text's topic
  • Domain knowledge:
  • Specialized, subject-specific knowledge about the
    text's topic
  • Commonsense knowledge:
  • Knowledge "everyone" has
  • E.g., CYC, cultural literacy (Hirsch)
  • These overlap:
  • PK should include some CSK, might include some DK
  • BK might include much DK

100
Steps towards a Proper Definition of Context
  • Step 1:
  • The context of X for a reader =def
  • The co-text of X
  • + the reader's prior knowledge
  • Both are needed!
  • After reading
  • "the white brachet bit the hart in the buttock",
  • most subjects infer that brachets are (probably)
    animals, from:
  • That text, plus
  • Available PK premise: "If x bites y, then x is
    (probably) an animal."
  • Inference is not an enthymeme!
  • (= argument with missing premise)

101
Proper definition of context
  • (Inference is not an enthymeme, because:)
  • When you read, you internalize the text
  • You "bring it into your mind"
  • Gärdenfors 1997, 1999; Jackendoff 2002
  • Missing premise might be in reader's mind!
  • This internalized text is more important than
    the actual words on paper:
  • Text: "I'm going to put the cat out"
  • Misread as: "I'm going to put the car out"
  • leads to different understanding of the text
  • What matters is what the reader thinks the text
    is,
  • not what the text actually is.
  • Therefore ...

102
Proper definition of context
  • Step 2
  • The context of X for a reader def
  • A single KB, consisting of
  • The readers internalized co-text of X
  • the readers prior knowledge

103
Proper definition of context
  • But: What is "+"?
  • Not mere conjunction or union!
  • Active readers make inferences while reading:
  • From text "a white brachet"
  • + prior commonsense knowledge that only physical
    objects have color,
  • reader might infer that brachets are physical
    objects.
  • From "The knight took up the brachet and rode
    away with the brachet."
  • + prior commonsense knowledge about size,
  • reader might infer that brachet is small enough
    to be carried.
  • Whole > sum of parts:
  • inference from internalized text + PK ⇒ new
    info not in text or in PK
  • I.e., you can learn from reading!

104
Proper definition of context
  • But: Whole < sum of parts!
  • Reader can learn that some prior beliefs were
    mistaken
  • Or reader can decide that text is mistaken
    (less likely)
  • Reading & CVA need belief revision!
  • "+" = operation:
  • input: PK & internalized co-text
  • output: belief-revised integration of input
    (schematically sketched below), via:
  • Expansion:
  • addition of new beliefs from ICT into PK, plus
    new inferences
  • Revision:
  • retraction of inconsistent prior beliefs together
    with inferences from them
  • Consolidation:
  • eliminate further inconsistencies

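A schematic sketch of the "+" operation just described (expansion, then revision; consolidation is omitted). This is a toy over sentences-as-strings with a hand-made inconsistency oracle, not the SNeBR implementation:

```python
def integrate(pk, internalized_cotext, inconsistent):
    """Belief-revised integration ('+') of prior knowledge with the
    internalized co-text. `inconsistent(a, b)` is a toy oracle saying
    whether two beliefs clash; real consolidation (removing any further
    inconsistencies) is omitted here."""
    kb = set(pk)
    for belief in internalized_cotext:       # expansion
        clashes = {b for b in kb if inconsistent(b, belief)}
        kb -= clashes                        # revision: retract clashing prior beliefs
        kb.add(belief)
    return kb

pk = {"spears don't wear clothing",
      "dressing(x) -> x wears clothing"}
text = {"King Claudius dressed his spear"}
# Toy oracle: the dressing rule clashes with the spear-dressing report.
clash = lambda a, b: {a, b} == {"dressing(x) -> x wears clothing",
                                "King Claudius dressed his spear"}
print(integrate(pk, text, clash))
```
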
105
(Diagram, built up across slides 105-112: a "Text"
column (T1, T2, T3) and a "Prior Knowledge" KB
(PK1-PK4). Internalization maps each text sentence
Ti into the KB as I(Ti); inference adds new
propositions (P5, P6, P7); the result is the
belief-revised ("B-R") integrated KB.)
112
Note: All contextual reasoning is done in this
context, the B-R integrated KB.
113
Proper definition of context
  • One more detail: X needs to be internalized, too
  • Context is a 3-place relation, among:
  • reader, word, and text
  • Final(?) def. (restated in compact notation
    below):
  • Let T be a text.
  • Let R be a reader of T.
  • Let X be a word in T (that is unknown to R).
  • Let T-X be X's co-text in T.
  • Then:
  • The context that R should use to hypothesize a
    meaning for R's internalization of X as it occurs
    in T =def
  • The belief-revised integration of R's prior
    knowledge with R's internalization of T-X.

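The same definition in compact notation (a restatement only; the symbols PK_R for R's prior knowledge, I_R for R's internalization, and ⊕ for belief-revised integration are introduced here, not in the slides):

```latex
\mathrm{context}(R, X, T) \;=_{\mathrm{def}}\; PK_R \oplus I_R(T{-}X)
```
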
114
This definition agrees with
  • Cognitive-science & reading-theoretic views of
    text understanding
  • Schank 1982, Rumelhart 1985, etc.
  • AI techniques for text understanding:
  • Reader's mind modeled by KB of prior knowledge
  • Expressed in AI language (for us: SNePS)
  • Computational cognitive agent reads the text,
  • integrating text info into its KB, and
  • making inferences & performing belief revision
    along the way
  • When asked to define a word,
  • agent deductively searches this single,
    integrated KB for information to fill slots of a
    definition frame
  • Agent's context for CVA = this single,
    integrated KB

115
Distinguishing Prior Knowledge from Integrated
Co-Text
  • Each proposition in the single, integrated KB is
    marked with its source:
  • Originally from PK
  • Originally from text
  • Inferred (with sources of premises)
  • So the KB can be disentangled as needed for
    belief revision or to control inference
    (a minimal sketch follows).

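A minimal sketch of such source-marking (the structure and field names are hypothetical; in the implementation, SNePS propositions carry this marking):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Proposition:
    """A KB proposition marked with its source (field names hypothetical)."""
    content: str
    source: str            # "PK", "text", or "inferred"
    premises: tuple = ()   # sources of the premises, for inferred propositions

pk  = Proposition("only physical objects have color", "PK")
txt = Proposition("the brachet is white", "text")
inf = Proposition("brachets are physical objects", "inferred",
                  premises=(pk.source, txt.source))
print(inf)
```
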
116
Some Open Questions
  • Roles of spoken/visual/situative contexts
  • Relation of CVA "context" to formal theories of
    context (e.g., McCarthy, Guha)
  • Relation of I(T) to prior-KB; e.g.:
  • Is I(Ti) true in prior-KB?
  • It is accepted pro tem.
  • Is I(T) a "subcontext" of pKB or of the B-R KB?
  • How to activate relevant prior knowledge.
  • Etc.

117
Research Methodology
  • AI team
  • Develop, implement, test better computational
    theories of CVA
  • Translate into English for use by reading team
  • Reading team
  • Convert algorithms to curriculum
  • Think-aloud protocols
  • To gather new data for use by AI team
  • As curricular technique (case studies)

118
Problem in Converting Algorithm into Curriculum
  • A knight picks up a brachet and carries it away
  • Cassie:
  • Has perfect memory
  • Is a perfect reasoner
  • Automatically infers that brachet is small
  • People don't always realize this
  • May need prompting: "How big is the brachet?"
  • May need relevant background knowledge
  • May need help in drawing inferences
  • Teaching CVA ≈ teaching general reading
    comprehension?
  • Vocabulary knowledge correlates with reading
    comprehension

119
CVA & Science Education
  • Original goal: CVA in & for science education
  • Use CVA to improve reading of STEM materials
  • A side effect: CVA as science education
  • There are no ultimate authorities to consult:
  • No answers in the back of the book of life!
  • As true for STEM as it is for reading about STEM
  • ⇒ Goal of education:
  • To learn how to learn on one's own
  • Help develop confidence & desire to use that
    skill
  • CVA, as "scientific method in miniature",
    furthers this goal:
  • Find clues/evidence (gathering data)
  • Integrate them with personal background knowledge
  • Use together to develop new theory (e.g., new
    meaning)
  • Test/revise new theory (on future encounters with
    word)

120
CVA & Geography (? STEM)
  • Use texts w/ unknown geographic terms:
  • estuary
  • proximity (IGERT's own Valerie Raybold Yakich)
  • L1 acquisition of spatial terms:
  • Children's concepts ≠ adult concepts
  • L2 acquisition of spatial terms:
  • L2 spatial concepts ≠ L1 spatial concepts
  • Especially spatial prepositions

121
Conclusion
  • Developing a computational theory of CVA,
  • which can become
  • a useful educational technique for improving
    vocabulary and reading comprehension
  • a model of the scientific method
  • a useful tool for learning on one's own.

122
Participants
Santosh Basapur (IE), Aishwarya Bhave (CSE), Marc
Broklawski (CSE), Chien-chih Chi (PHI), Justin Del
Vecchio (CSE), Karen Ehrlich (Fredonia), Jeffrey
Galko (PHI), Christopher Garver (CSE), Paul
Gestwicki (CSE), Kazuhiro Kawachi (LIN), Adam
Lammert (Vassar/ugrad), Amanda MacRae (LAI), Brian
Morgan (LAI), Scott Napieralski (CSE), Vikranth Rao
(CSE/ugrad), Laurie Schimpf (LAI), Ashwin Shah
(CSE), Stuart C. Shapiro (UB/CSE), Anuroopa Shenoy
(CSE), Rajeev Sood (CSE), Taha Suglatwala (CSE),
Matthew Sweeney (ENG/CSE/ugrad), Matthew Watkins
(CEN), Karen Wieland (LAI), Yulai Xie (CSE),
Valerie Raybold Yakich (GEO), & SNeRG members.
(Markers in the original slide flagged new students,
Spring 2003 & Fall 2004, and those supported on the
NSF grant.)
123
Web Page
  • http://www.cse.buffalo.edu/~rapaport/cva.html

124
An Analysis of Think-Aloud Protocols of Good
Readers Using CVA Strategies During Silent Reading
125
Facilitating Vocabulary Growth
  • Word fun.
  • Roots, prefixes, affixes.
  • Dictionary.
  • Wide reading.
  • Contextual vocabulary acquisition (CVA).

126
Limited Data Describing CVA Processes
  • Nation ("guess").
  • Ames.
  • Deighton.
  • Sternberg.
  • Elshout-Mohr & van Daalen-Kapteijns.
  • Harmon.

127
Overview of Our Study
  • We asked good readers to think aloud when they
    encountered a word whose meaning they did not
    know as they silently read a set of 7-17 texts,
    each text containing at least one instance of the
    unknown word.

128
Overview continued
  • We analyzed the think-aloud protocol to gain
    understanding of the CVA processes.
  • Besides understanding CVA processes, our goal was
    to build a more effective CVA curriculum.

129
Beginning and Ending Research Questions
  • What text cues are used for CVA?
  • What are CVA reasoning processes?
  • What sense or meaning of an unknown word is
    gained from CVA?
  • How is information from prior encounters with a
    hard word used when the word is seen in new texts?

130
Methodology: Hard Words
  • Identified small set of hard words.
  • Some words from Ehrlich & Rapaport's earlier work
    with Cassie.
  • Used Dale & O'Rourke and Carroll, Davies, &
    Richman as a guide.
  • Some words came from scanning science and current
    event texts.

131
Methodology: Text Sets
  • For each word, identified a set of 7-17 authentic
    texts, each with 1 or more instances of the word.
  • Hard words were in boldface font.
  • Sometimes, replaced a real hard word with a
    neologism (a non-word); e.g.,
  • "itresia" for "estuary"

132
Methodology: Participants
  • High school students.
  • Excellent or outstanding readers.
  • Readers given pre-test.

133
(No Transcript)
134
(No Transcript)
135
Methodology: Procedures
  • Worked 1-1 with researcher.
  • Read each passage, one at a time.
  • Researcher provided meanings of other words in
    the text that the reader did not know.

136
Procedures continued
  • When hard word encountered, reader thought aloud
    while trying to gain a sense of the word's
    meaning.
  • Audio-tape-recorded the think-alouds.
  • Transcribed the taped think-alouds.

137
Methodology: Analyses of Verbal Protocol
  • Processing of texts and hard words.
  • Use of external context cues in CVA.
  • Reasoning processes in CVA.
  • Sense of word meaning gained from CVA.

138
Research Assumptions: Good Readers
  • Will know and be able to apply multiple CVA
    processes.
  • Are wide readers who have incidentally or
    deliberately learned many words from reading.
  • Will have excellent comprehension of text.

139
Research Assumptions: CVA Processes
  • CVA processes are a set of sub-strategies
    activated by a disruption of text processing
    caused by encountering an unknown word.

140
Research assumptions: CVA processes (continued)
  • CVA shares many characteristics of reading
    comprehension (e.g., use of selective text cues,
    prior knowledge, reasoning), but triggers text
    processing strategies different than ordinary
    comprehension fix-up strategies.

141
Research Assumptions: CVA for Word Learning and
Reading Comprehension
  • Most readers apply CVA processes to gain meaning
    from the text, and therefore gain word meanings
    incidentally.
  • We ask readers to try to gain a sense of the
    unknown word's meaning deliberately.

142
Research Assumption: Conditional Factors Needed
for Applying CVA
  • Disrupted comprehension is required, or reader
    may just skip word.
  • Word awareness; i.e., reader must note that there
    is a hard word (Reed, 1957; Harmon, 1999).

143
Overview of Findings
  1. Processing of texts and hard words in CVA.
  2. Use of text cues in CVA.

144
Overview of findings continued
  • Reasoning processes in CVA.
  • Hypotheses or model building.
  • Inferential / abstract reasoning from reading
    comprehension.
  • Within-sentence language cues.
  • Information processing / knowledge acquisition
    processes (Sternberg, 1987)
  • Global strategies.
  • Prior knowledge in CVA processes.

145
Overview of findings continued
  1. Sense of the word meaning from CVA.

146
Findings 1: Processing Texts and Hard Words
  • After encountering a hard word, some readers
    seemed to continue to read the full passage, then
    returned to the word to work on its meaning.
  • Some readers stopped reading at the hard word
    (or read to the end of the sentence), and worked
    on the word's meaning immediately.

147
Findings 2: Use of External Text Cues
  • First, we classified the textual cues in texts
    using Ames (1969), Deighton (1978), Sternberg &
    Powell (1983), Ehrlich (1995), and Sternberg
    (1987).
  • Second, we analyzed the think-aloud protocols to
    see if these were the clues readers used.

148
Use of external text cues continued
  • When reading and encountering an unknown word,
    readers generally started CVA with reasoning
    processes.
  • Sometimes they went back to text, sometimes they
    did not.

149
Use of external text cues continued
  • After forming a hypothesis, some readers
    reinspected text to find support for hypothesis.
  • Some readers created a hypothesis using general
    passage and sentence meaning, but did not go back
    to text.

150
Use of external text cues continued
  • Others said there was nothing in text to help
    them gain a sense of word meaning.
  • Other readers did not go back to text at all
    unless prompted.

151
Use of external text cues continued
  • Readers inferred a sense of the word on the basis
    of
  • general passage meaning,
  • meaning of the specific sentence,
  • sentence language and syntax,
  • prior knowledge, and
  • prior passages.

152
Use of external text cues continued
  • When readers did refer to the passage for
    specific information
  • Usually to confirm a hypothesis.
  • Did not generally select the sentences we had
    predicted they would use.

153
These Findings Lead Us in Three Directions
  • An unpredicted but, with hindsight, logical
    conclusion.
  • Curricular implications.
  • Abandoning the coding of available cues in the
    text in favor of coding the reasoning processes
    applied to word meaning.

154
Why Did Readers Not First Reinspect the Text
for Cues?
  • They did not know the word's meaning, so those
    text cues had no particular salience for the
    reader.
  • For the researchers, however, these cues were
    salient, because we already knew the word's
    meaning.
  • Readers varied in what they accepted as a
    sufficient hypothesis.
  • We did not teach readers specific cues to look
    for; we wanted to see what they did independently.

155
  • That is, when a reader knows a word's meaning,
    that word's connection to all the cues in the
    text is obvious.
  • But when one does not know the meaning of the
    word, one does not readily discern the cues that
    provide insight into the word's meaning.

156
Our Conclusion
  • Using Context for CVA is Easy When You Already
    Know the Meaning of the Word!

157
Curricular Implications
  • Teachers should model CVA with words they do not
    know in texts they have not previously seen.
  • Students should practice CVA with words they do
    not know in texts they have not previously seen.

158
Findings 3: Reasoning Processes in CVA
  • Hypotheses or model building
  • Inferring / abstract reasoning from reading
    comprehension
  • Language cues
  • Global strategies
  • Background knowledge
  • Conclusion

159
Findings Hypothesis Building
  • All readers hypothesized a meaning of the hard
    word.

160
On Further Encounters With Word, Readers
  • Confirmed hypothesis if congruent with text,
    usually stating rationale
  • Revised hypothesis if not congruent, usually
    stating rationale or
  • If hypothesis not congruent with text, but not
    enough information in text to revise it, readers
    questioned hypothesis.

161
Like Elshout-Mohr & van Daalen-Kapteijns' Good
Readers
  • Our readers generally modified hypotheses in
    keeping with text.
  • Our readers seemed to be aware that they did not
    really know the word's meaning, that what they
    knew was a hypothesis.

162
Unlike Deegan (1995)
  • Our readers rarely altered the meaning of text to
    stay in keeping with prior hypothesis.

163
Within Sentence Language Cues
  • Familiar expressions
  • Figurative language
  • Connected series

164
Inferring / Abstract Reasoning from Reading
Comprehension
  • Encoding selected information.
  • Combining selected text information.
  • Comparing selected information from text to
    background knowledge.

165
Global Comprehension Strategies
  • Visualizing.
  • Summarizing.
  • Clarifying.
  • Self questioning.
  • Insight.
  • Confirming / confidence.

166
Background Knowledge
  • Essential
  • Idiosyncratic
  • Pervasive

167
Findings 4: Sense of the Word Meaning from CVA
  • "Right" and "wrong" are not useful descriptors of
    the appropriateness of word meanings from CVA
    processes.
  • "Rational" and "defensible" are better descriptors
    of appropriateness than "right" or "wrong".
  • Gradual and cumulative over texts.

168
Sense of the word continued
  • 0. No meaning provided.
  • "Don't Know" (dk).
  • Incorrect: no logical justification for sense of
    word.
  • Incorrect: reasonable justification for sense of
    word proffered.
  • Incorrect: based on language patterns, not
    general text meaning.

169
Sense of the word continued
  • Vague or partial word meaning sense.
  • Approximate word meaning sense.
  • Nearly correct word meaning sense.
  • Correct sense of word meaning.

170
How to Improve CVA in Classrooms
  • Guess?
  • Magical Mathematical Formula?
  • CVA Strategies!

171
How to improve CVA continued
  • Teacher modeling of CVA strategies in texts with
    words whose meanings are not known.
  • Scaffolding groups as they together think-aloud
    when applying CVA strategies in texts with words
    whose meanings are not known.

172
How to improve CVA continued
  • Guiding small groups in think-alouds of CVA
    strategies in texts with words whose meanings are
    not known.
  • Students independent application of CVA
    strategies in texts with words whose meanings are
    not known.

173
Protocol Study Limitations
  • Reading done in a research environment.
  • We required reader to think about the unknown
    word and its meaning.
  • Readers ordinarily might choose to skip or ignore
    word.
  • Readers ordinarily might not note word.

174
Protocol limitations continued
  • Our readers encountered word in multiple,
    consecutive texts.
  • Therefore, readers had immediate memory of the
    previous encounter.
  • This is atypical, as readers ordinarily might not
    encounter new word a second time for a long
    period.

175
Protocol limitations continued
  • We sometimes asked leading questions; e.g.:
  • To activate background knowledge.
  • To direct reader back to text.
  • To elicit reasoning processes.
  • To ask reader to think again.

176
Protocol limitations continued
  • At times, we used non-words in place of real
    words; e.g.,
  • "schmalion" for "tatterdemalion"
  • "vedosarn" for "taciturn"
  • "itresia" for "estuary".

177
Protocol Study Strengths
  • Used authentic texts.
  • Words were, generally, difficult and not
    previously known by readers.