1
cva-727.ppt
  • 2007-01-22

2
Contextual Vocabulary Acquisition: From Algorithm to Curriculum
  • William J. Rapaport
  • Department of Computer Science & Engineering,
  • Department of Philosophy, Department of Linguistics,
  • and Center for Cognitive Science
  • rapaport@cse.buffalo.edu
  • http://www.cse.buffalo.edu/~rapaport

3
Contextual Vocabulary Acquisition
  • Active, conscious acquisition of a meaning for a word in a text by reasoning from context
  • CVA = what you do when:
  • You're reading
  • You come to an unfamiliar word
  • It's important for understanding the passage
  • No one's around to ask
  • Dictionary doesn't help:
  • No dictionary
  • Too lazy to look it up :-)
  • Word not in dictionary
  • Definition of no use:
  • Too hard
  • Inappropriate
  • So, you figure out a meaning for the word from context
  • figure out = compute (infer) a hypothesis about what the word might mean in that text
  • context = ??

4
Overview of CVA Project
  • From Algorithm to Curriculum
  • Implemented computational theory of how to figure out (compute) a meaning for an unfamiliar word from wide context
  • Based on:
  • algorithms developed by Karen Ehrlich (1995)
  • verbal protocols (case studies)
  • Implemented in a semantic-network-based knowledge-representation & reasoning system:
  • SNePS (Stuart C. Shapiro & colleagues)

5
Overview of CVA Project (cont'd)
  • From Algorithm to Curriculum
  • Convert algorithms to an improved, teachable curriculum
  • To improve vocabulary & reading comprehension
  • Joint work with Michael Kibby
  • Center for Literacy & Reading Instruction

6
'Meaning' of 'Meaning'
  • 'the meaning of a word' vs. 'a meaning for a word'
  • 'the' ⇒ single, correct meaning
  • 'of' ⇒ meaning belongs to word
  • 'a' ⇒ many possible meanings, depending on textual context, reader's prior knowledge, etc.
  • 'for' ⇒ reader hypothesizes meaning from context, gives it to word

7
  • 'The meaning of things lies not in themselves but in our attitudes toward them.'
  • Antoine de Saint-Exupéry, Wisdom of the Sands (1948)
  • 'Words don't have meaning; they're cues to meaning!'
  • Jeffrey L. Elman, 'On Dinosaur Bones & the Meaning of Words' (2007)
  • 'We cannot locate meaning in the text… this is an active, dynamic process, existing only in interactive behaviors of cultural, social, biological, and physical environment-systems.'
  • William J. Clancey, 'Scientific Antecedents of Situated Cognition' (forthcoming)

8
CVA as Cognitive Science
  • Studied in:
  • AI / computational linguistics
  • Psychology
  • Child-language development (L1 acquisition)
  • L2 acquisition (e.g., ESL)
  • Reading education (vocabulary development)
  • Thus far: multi-disciplinary
  • Not yet inter-disciplinary!

9
What Does 'Brachet' Mean? (From Malory's Morte D'Arthur; page numbers in brackets)
  • 1. There came a white hart running into the hall with a white brachet next to him, and thirty couples of black hounds came running after them. [66]
  • As the hart went by the sideboard, the white brachet bit him. [66]
  • The knight arose, took up the brachet and rode away with the brachet. [66]
  • A lady came in and cried aloud to King Arthur, 'Sire, the brachet is mine.' [66]
  • There was the white brachet which bayed at him fast. [72]
  • 18. The hart lay dead; a brachet was biting on his throat, and other hounds came behind. [86]

10
Figure out meaning of word from what?
  • context (i.e., the text)?
  • Werner & Kaplan '52, McKeown '85, Schatz & Baldwin '86
  • context and reader's background knowledge?
  • Granger '77, Sternberg '83, Hastings '94
  • context including background knowledge?
  • Nation & Coady '88, Graesser & Bower '90
  • Note:
  • context = text ⇒ context is external to reader's mind
  • Could also be spoken/visual/situative (still external)
  • background knowledge = internal to reader's mind
  • What is (or should be) the context for CVA?

11
What Is the Context for CVA?
  • context ≠ textual context
  • surrounding words = 'co-text' of word
  • context = 'wide context':
  • internalized co-text
  • reader's interpretive mental model of textual co-text
  • involves local interpretation (McKoon & Ratcliff): pronoun resolution, simple inferences (proper names)
  • & global interpretation (full use of available PK)
  • can involve misinterpretation
  • integrated via belief revision
  • infer new beliefs from internalized co-text + prior knowledge
  • remove inconsistent beliefs
  • with reader's prior knowledge:
  • world knowledge
  • language knowledge
  • previous hypotheses about word's meaning
  • but not including external sources (dictionary, humans)
  • ⇒ Context for CVA is in reader's mind, not in the text

12
Some Proposed Preliminary Definitions (to extract order out of confusion)
  • 'Unknown word' for a reader =def
  • Word or phrase that reader has never seen before
  • Or of whose meaning the reader has only a vague idea
  • Different levels of knowing meaning of word
  • Notation: X

13
Proposed preliminary definitions
  • Text =def
  • (written) passage
  • containing X
  • single phrase or sentence … several paragraphs

14
Proposed preliminary definitions
  • Co-text of X in some text =def
  • The entire text minus X; i.e., entire text surrounding X (see the sketch below)
  • E.g., if X = 'brachet', and text =
  • 'There came a white hart running into the hall with a white brachet next to him, and thirty couples of black hounds came running after them.'
  • Then X's co-text in this text =
  • 'There came a white hart running into the hall with a white ______ next to him, and thirty couples of black hounds came running after them.'
  • Cf. 'cloze' tests in psychology
  • But, in CVA, reader seeks meaning or definition,
  • NOT a missing word or synonym: there's no 'correct answer'!
  • 'Co-text' is what many mean by 'context'
  • BUT they shouldn't!
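A minimal sketch (in Python, not part of the original slides) of forming a co-text: mask every occurrence of X in the text. The function name and masking convention are illustrative assumptions, not the project's code.

```python
import re

def cotext(text: str, word: str, blank: str = "______") -> str:
    """Return the co-text of `word` in `text`: the text with `word` masked."""
    return re.sub(rf"\b{re.escape(word)}\b", blank, text)

print(cotext("a white hart with a white brachet next to him", "brachet"))
# -> a white hart with a white ______ next to him
```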

15
Proposed preliminary definitions
  • The reader's prior knowledge =def
  • the knowledge that the reader has when s/he begins to read the text
  • and is able to recall as needed while reading
  • knight picks up & carries brachet ⇒ brachet is small
  • Warnings:
  • 'knowledge' ⇒ truth
  • so, 'prior beliefs' is better
  • 'prior' vs. 'background' vs. 'world', etc.
  • See next slide!

16
Proposed preliminary definitions
  • Possible synonyms for 'prior knowledge',
  • each with different connotation:
  • Background knowledge:
  • Can use for information that author assumes reader to have
  • World knowledge:
  • General factual knowledge about things other than the text's topic
  • Domain knowledge:
  • Specialized, subject-specific knowledge about the text's topic
  • Commonsense knowledge:
  • Knowledge 'everyone' has
  • E.g., CYC, cultural literacy (Hirsch)
  • These overlap:
  • PK should include some CSK, might include some DK
  • BK might include much DK

17
Steps towards a Proper Definition of Context
  • Step 1:
  • The context of X for a reader =def
  • The co-text of X
  • + the reader's prior knowledge
  • Both are needed!
  • After reading:
  • 'the white brachet bit the hart in the buttock'
  • most subjects infer that brachets are (probably) animals, from:
  • That text, plus
  • Available PK premise: If x bites y, then x is (probably) an animal.
  • Inference is not an enthymeme! (because …)

18
Proper definition of context
  • But (inference not an enthymeme because):
  • When you read, you internalize the text
  • You 'bring it into' your mind
  • Gärdenfors 1997, 1999; Jackendoff 2002
  • This internalized text is more important than the actual words on paper:
  • Text: 'I'm going to put the cat out'
  • Misread as: 'I'm going to put the car out'
  • leads to different understanding of the text
  • What matters is what the reader thinks the text is,
  • Not what the text actually is
  • Therefore …

19
On Misinterpretation
  • Sign seen on truck parked outside of cafeteria at
    Student Union
  • Mills Wedding and Specialty Cakes

20
On Misinterpretation
  • Sign seen on truck parked outside of cafeteria at
    Student Union
  • Mills Welding and Specialty Gases

21
Proper definition of context
  • Step 2:
  • The context of X for a reader =def
  • A single KB, consisting of:
  • The reader's internalized co-text of X
  • + the reader's prior knowledge

22
Proper definition of context
  • But: What is '+'?
  • Not mere conjunction or union!
  • Active readers make inferences while reading:
  • From text: 'a white brachet'
  • + prior commonsense knowledge: only physical objects have color,
  • reader might infer that brachets are physical objects
  • From: 'The knight took up the brachet and rode away with the brachet.'
  • + prior commonsense knowledge about size,
  • reader might infer that brachet is small enough to be carried
  • Whole > Σ parts:
  • inference from internalized text + PK ⇒ new info not in text or in PK
  • I.e., you can learn from reading!

23
Proper definition of context
  • But: Whole < Σ parts!
  • Reader can learn that some prior beliefs were mistaken
  • Or reader can decide that text is mistaken (less likely)
  • Reading & CVA need belief revision! (see the sketch below)
  • '+' operation:
  • input: PK & internalized co-text
  • output: belief-revised integration of input, via:
  • Expansion:
  • addition of new beliefs from ICT into PK, plus new inferences
  • Revision:
  • retraction of inconsistent prior beliefs, together with inferences from them
  • Consolidation:
  • eliminate further inconsistencies
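A schematic sketch (assumed Python rendering, not SNeBR itself) of the three operations just listed. `infer`, `inconsistent`, and `less_entrenched` are hypothetical oracles; the real system tracks which hypotheses support each derived belief.

```python
def expand(kb: set, internalized_cotext: set, infer) -> set:
    """Expansion: add the internalized co-text, plus all new inferences."""
    kb = kb | internalized_cotext
    return kb | infer(kb)

def revise(kb: set, inconsistent, less_entrenched) -> set:
    """Revision: while inconsistent, retract the least-entrenched culprit."""
    while inconsistent(kb):
        kb = kb - {less_entrenched(kb)}
    return kb

def consolidate(kb: set, inconsistent, less_entrenched) -> set:
    """Consolidation: eliminate any inconsistencies that remain."""
    return revise(kb, inconsistent, less_entrenched)
```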

24–32
(Figure sequence, one diagram per slide: the reader's mind starts with prior knowledge PK1 … PK4. Text sentences T1, T2, T3 are internalized as I(T1), I(T2), I(T3); inference over the internalized text plus prior knowledge adds new propositions P5, P6, P7, yielding the belief-revised ("B-R") integrated KB. Note on slides 31–32: all contextual reasoning is done in this context, the B-R integrated KB, i.e., the reader's mind.)
33
Proper definition of context
  • One more detail: X needs to be internalized, too
  • Context is a 3-place relation among:
  • Reader, word, and text
  • Final(?) def.:
  • Let T be a text
  • Let R be a reader of T
  • Let X be a word in T (that is unknown to R)
  • Let T−X be X's co-text in T.
  • Then:
  • The context that R should use to hypothesize a meaning for R's internalization of X as it occurs in T =def
  • The belief-revised integration of R's prior knowledge with R's internalization of T−X.
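The definition compresses into one line of code. This is an assumed toy rendering (texts as sets of sentences, revision omitted); the helper names stand for the operations on the preceding slides, not actual SNePS functions.

```python
def cotext(T: set, X: str) -> set:
    """X's co-text in T: here, toy-modeled as the sentences not containing X."""
    return {s for s in T if X not in s}

def internalize(cotext_: set) -> set:
    """The reader's mental model of the co-text (toy: tagged copies)."""
    return {("internalized", s) for s in cotext_}

def br_integrate(PK: set, ict: set) -> set:
    """Belief-revised integration (toy: union; revision omitted here)."""
    return PK | ict

def context(PK: set, X: str, T: set) -> set:
    """The context R should use to hypothesize a meaning for X in T."""
    return br_integrate(PK, internalize(cotext(T, X)))
```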

34
This definition agrees with
  • Cognitive-science & reading-theoretic views of text understanding:
  • Schank 1982, Rumelhart 1985, etc.
  • KRR techniques for text understanding:
  • Reader's mind modeled by KB of prior knowledge
  • Expressed in KR language (for us: SNePS)
  • Computational cognitive agent reads the text,
  • integrating text info into its KB, and
  • making inferences & performing belief revision along the way
  • When asked to define a word,
  • Agent deductively searches this single, integrated KB for information to fill slots of a definition frame
  • Agent's context for CVA = this single, integrated KB

35
Distinguishing Prior Knowledge from Integrated Co-Text
  • Each proposition in the single, integrated KB is marked with its source:
  • Originally from PK
  • Originally from text
  • Inferred
  • Sources of premises
  • So the KB can be disentangled as needed for belief revision or to control inference

36
Some Open Questions
  • Roles of spoken/visual/situative contexts
  • Relation of CVA context to formal theories of context (e.g., McCarthy, Guha)
  • Relation of I(T) to prior KB, e.g.:
  • Is I(Ti) true in prior KB?
  • It is accepted pro tem.
  • Is I(T) a subcontext of pKB or B-R KB?
  • How to activate relevant prior knowledge
  • Etc.

37
Background of CVA Project
  • People do 'incidental' (unconscious) CVA
  • Possibly best explanation of how we learn vocabulary:
  • Given # of words a high-school grad knows (45K), # of years to learn them (18): ≈ 2.5K words/year
  • But only ~10% are taught in 12 school years
  • Students are taught 'deliberate' (conscious) CVA in order to improve their vocabulary

38
1. Computational CVA
  • Implemented in SNePS (Shapiro 1979; Shapiro & Rapaport 1992):
  • Intensional, propositional semantic-network knowledge-representation, reasoning, & acting system
  • Indexed by node: from any node, can describe rest of network
  • Serves as model of the reader ('Cassie')
  • KB: SNePS representation of reader's prior knowledge
  • I/P: SNePS representation of word in its co-text
  • Processing (simulates/models/is?! reading):
  • Uses logical inference, generalized inheritance, & belief revision to reason about text integrated with reader's prior knowledge
  • N & V definition algorithms deductively search this belief-revised, integrated KB (the context) for slot fillers for definition frame
  • O/P: Definition frame (see the sketch below)
  • slots (features): classes, structure, actions, properties, etc.
  • fillers (values): info gleaned from context (= integrated KB)
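A minimal sketch of such a definition frame as a data structure (assumed Python; slot names follow the noun algorithm's slots, and the dataclass is illustrative, not the actual SNePS output):

```python
from dataclasses import dataclass, field

@dataclass
class DefinitionFrame:
    word: str
    class_inclusions: list = field(default_factory=list)      # e.g., ["animal"]
    possible_actions: list = field(default_factory=list)      # e.g., ["bite"]
    possible_properties: list = field(default_factory=list)   # e.g., ["white"]
    possibly_similar_items: list = field(default_factory=list)

brachet = DefinitionFrame("brachet",
                          class_inclusions=["phys obj"],
                          possible_properties=["white"],
                          possibly_similar_items=["animal", "mammal", "dog"])
```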

39
Cassie learns what 'brachet' means
  • Background info about: harts, animals, King Arthur, etc.
  • No info about brachets
  • Input: formal-language (SNePS) version of simplified English

A hart runs into King Arthur's hall.
  In the story, B12 is a hart. In the story, B13 is a hall. In the story, B13 is King Arthur's. In the story, B12 runs into B13.
A white brachet is next to the hart.
  In the story, B14 is a brachet. In the story, B14 has the property 'white'.
Therefore, brachets are physical objects.
  (deduced while reading; PK: Cassie believes that only physical objects have color)
40
--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: phys obj,
  Possible Properties: white,
  Possibly Similar Items: animal, mammal, deer, horse, pony, dog
I.e., a brachet is a physical object that can be white and that might be like an animal, mammal, deer, horse, pony, or dog
41
A hart runs into King Arthur's hall.
A white brachet is next to the hart.
The brachet bites the hart's buttock.
  PK: Only animals bite
--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: animal,
  Possible Actions: bite buttock,
  Possible Properties: white,
  Possibly Similar Items: mammal, pony
42
A hart runs into King Arthur's hall.
A white brachet is next to the hart.
The brachet bites the hart's buttock.
The knight picks up the brachet.
The knight carries the brachet.
  PK: Only small things can be picked up/carried
--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: animal,
  Possible Actions: bite buttock,
  Possible Properties: small, white,
  Possibly Similar Items: mammal, pony
43
A hart runs into King Arthur's hall.
A white brachet is next to the hart.
The brachet bites the hart's buttock.
The knight picks up the brachet.
The knight carries the brachet.
The lady says that she wants the brachet.
  PK: Only valuable things are wanted
--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: animal,
  Possible Actions: bite buttock,
  Possible Properties: valuable, small, white,
  Possibly Similar Items: mammal, pony
44
  • A hart runs into King Arthur's hall. A white brachet is next to the hart. The brachet bites the hart's buttock. The knight picks up the brachet. The knight carries the brachet. The lady says that she wants the brachet.
  • The brachet bays at Sir Tor. (PK: Only hunting dogs bay)
  • --> (defineNoun "brachet")
  • Definition of brachet:
  • Class Inclusions: hound, dog,
  • Possible Actions: bite buttock, bay, hunt,
  • Possible Properties: valuable, small, white
  • I.e., a brachet is a hound (a kind of dog) that can bite, bay, and hunt,
  • and that may be valuable, small, and white.

45
General Comments
  • Cassie's behavior ≈ human protocols
  • Cassie's definition ≈ OED's definition:
  • 'A brachet is a kind of hound which hunts by scent'

46
How Does Our System Work?
  • Uses a semantic-network computer system:
  • semantic networks ≈ concept maps
  • serves as a model of the reader
  • represents:
  • reader's prior knowledge
  • the text being read
  • can reason about the text and the reader's knowledge

47
Fragment of reader's prior knowledge:
m3: In 'real life', white is a color
    Member(Lex(white), Lex(color), LIFE)
m6: In real life, harts are deer
    AKO(Lex(hart), Lex(deer), LIFE)
m8: In real life, deer are mammals
    AKO(Lex(deer), Lex(mammal), LIFE)
m11: In real life, halls are buildings
    AKO(Lex(hall), Lex(building), LIFE)
m12: In real life, b1 is named 'King Arthur'
    Name(b1, 'King Arthur', LIFE)
m14: In real life, b1 is a king
    Isa(b1, Lex(king), LIFE)
(etc.)
48
m16: if v3 has property v2, & v2 is a color, & v3 is a member of class v1, then v1 is a class of physical objects
    all(x,y,z)(Is(z,y) & Member(y, lex(color)) & Member(z,x) => AKO(x, lex(physical object)))
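Illustratively (and only as an assumption about how such a rule behaves), the color rule m16 can be rendered as forward chaining over simple (relation, arg1, arg2) triples rather than a SNePS network:

```python
def apply_color_rule(kb: set) -> set:
    """m16 as a toy rule: if z has property y, y is a color, and z is a
    member of class x, then x is a class of physical objects."""
    new = set()
    for (rel, z, y) in kb:
        if rel == "has-property" and ("member", y, "color") in kb:
            for (rel2, z2, x) in kb:
                if rel2 == "member" and z2 == z:
                    new.add(("ako", x, "physical object"))
    return kb | new

kb = {("has-property", "b14", "white"),
      ("member", "white", "color"),
      ("member", "b14", "brachet")}
assert ("ako", "brachet", "physical object") in apply_color_rule(kb)
```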
49
Reading the story:
m17: In the story, b2 is a hart
    Isa(b2, lex(hart), STORY)
m24: In the story, the hart runs into b3
    Does(b2, into(b3, lex(run)), STORY)
(b3 is King Arthur's hall: not shown)
(harts are deer: not shown)
50
A fragment of the entire network, showing the reader's mental context consisting of prior knowledge, the story, & inferences. The definition algorithm searches this entire network, abstracts parts of it, & produces a hypothesized meaning for 'brachet'.
51
Implementation
  • SNePS (Stuart C. Shapiro & SNeRG):
  • Intensional, propositional semantic-network knowledge-representation & reasoning system
  • Formula-based & path-based reasoning
  • I.e., logical inference & generalized inheritance
  • SNeBR: belief-revision system
  • Used for revision of definitions
  • SNaLPS: natural-language input/output
  • Cassie: computational cognitive agent

52
How It Works
  • SNePS represents:
  • background knowledge + text information
  • in a single, consolidated semantic network
  • Algorithms deductively search network for slot-fillers for definition frame
  • Search is guided by desired slots
  • E.g., prefers general info over particular info, but takes what it can get

53
The Algorithms
  • Generate initial hypothesis by 'syntactic manipulation' (see the sketch below):
  • Algebra: solve an equation for unknown value X
  • Syntax: 'solve' a sentence for unknown word X
  • 'A white brachet (X) is next to the hart' ⇒ X (a brachet) is something that is next to the hart and that can be white.
  • I.e., define node X in terms of immediately connected nodes
  • Deductively search wide context for more information
  • I.e., define word X in terms of some (but not all) other connected nodes
  • Return definition frame.
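A toy sketch of 'solving a sentence for X' (assumed Python; the patterns are illustrative stand-ins for the actual SNePS syntactic-manipulation rules):

```python
import re

def initial_hypothesis(sentence: str, word: str) -> dict:
    """Pull out what the co-text directly predicates of the unknown word."""
    hyp = {"word": word, "properties": [], "relations": []}
    m = re.search(rf"a (\w+) {word}", sentence)    # "a white brachet"
    if m:
        hyp["properties"].append(m.group(1))
    m = re.search(rf"{word} (is .+)$", sentence)   # "... is next to the hart"
    if m:
        hyp["relations"].append(m.group(1))
    return hyp

print(initial_hypothesis("a white brachet is next to the hart", "brachet"))
# {'word': 'brachet', 'properties': ['white'], 'relations': ['is next to the hart']}
```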

54
Noun Algorithm
  • Generate initial hypothesis by syntactic manipulation
  • Then find or infer from wide context (see the fallback sketch below):
  • Basic-level class memberships (e.g., 'dog', rather than 'animal')
  • else most-specific-level class memberships
  • else names of individuals
  • Properties of Xs (else, of individual Xs) (e.g., size, color, …)
  • Structure of Xs (else …) (part-whole, physical structure)
  • Acts that Xs perform (else …) or that can be done to/with Xs
  • Agents that do things to/with Xs
  • or to whom things can be done with Xs
  • or that own Xs
  • Possible synonyms, antonyms
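A sketch of the 'else … else …' fallback order for class membership (assumed Python; `query` stands in for SNePS's deductive search):

```python
def class_inclusions(word: str, query) -> list:
    """Prefer basic-level classes; else most-specific classes; else names."""
    for level in ("basic-level-class", "most-specific-class", "individual-name"):
        found = query(word, level)   # deductive search of the integrated KB
        if found:
            return found
    return []

# Example with a toy KB: basic-level info is missing, so the search falls back.
kb = {("brachet", "most-specific-class"): ["animal"]}
print(class_inclusions("brachet", lambda w, lvl: kb.get((w, lvl), [])))
# -> ['animal']
```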

55
Verb Algorithm
  • Generate initial hypothesis by syntactic manipulation
  • Then find or infer from wide context:
  • Class membership (e.g., Conceptual Dependency):
  • What kind of act is X-ing? (e.g., walking is a kind of moving)
  • What kinds of acts are X-ings? (e.g., sauntering is a kind of walking)
  • Properties/manners of X-ing (e.g., moving by foot, slow walking)
  • Transitivity/subcategorization information:
  • Return class membership of agent, object, indirect object, instrument
  • Possible synonyms, antonyms
  • Causes & effects
  • Also: preliminary work on adjective/adverb algorithm

56
Computational cognitive theory of how to learn
word meanings from context (cont.)
  • 3 kinds of vocabulary acquisition:
  • Construct new definition of unknown word:
  • What does 'brachet' mean?
  • Fully revise definition of misunderstood word:
  • Does 'smiting' entail killing?
  • Expand definition of word used in new sense:
  • Can you 'dress' (i.e., clothe) a spear?
  • Initial hypothesis
  • Revision(s) upon further encounter(s)
  • Converges to stable, dictionary-like definition
  • Subject to revision

57
Belief Revision
  • To revise definitions of words used inconsistently with current meaning hypothesis:
  • SNeBR (ATMS; Martins & Shapiro 1988, Johnson 2006):
  • If inference leads to a contradiction, then:
  • SNeBR asks user to remove culprit(s)
  • & automatically removes consequences inferred from culprit

58
Belief Revision
  • Used to revise definitions of words used in a different sense from current meaning hypothesis:
  • SNeBR (ATMS; Martins & Shapiro '88):
  • If inference leads to a contradiction, then:
  • SNeBR asks user to remove culprit(s)
  • & automatically removes consequences inferred from culprit
  • SNePSwD (SNePS w/ Defaults; Martins & Cravo '91):
  • Previously used to automate step 1, above
  • Now, legacy code
  • AutoBR (Johnson & Shapiro, in progress?):
  • new default reasoner (Bhushan & Shapiro)
  • Will replace SNePSwD

59
Revision & Expansion
  • Removal & revision being automated via SNePSwD by ranking all propositions with kn_cat (see the sketch below):
  • most certain: intrinsic = info re language & fundamental background info
  •   ('before' is transitive)
  • story = info in text ('King Lot rode to town')
  • life = background info w/o variables or inference ('dogs are animals')
  • story-comp = info inferred from text ('King Lot is a king & rode on a horse')
  • life-rule.1 = everyday commonsense background info (BearsLiveYoung(x) ⇒ Mammal(x))
  • life-rule.2 = specialized background info (x smites y ⇒ x kills y by hitting y)
  • least certain: questionable = already-revised life-rule.2; not part of input
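A sketch of kn_cat-based culprit selection (assumed Python; the entrenchment ranking follows this slide, and `pick_culprit` is a hypothetical helper, not SNePSwD code):

```python
KN_CAT_RANK = {            # most certain (high) to least certain (low)
    "intrinsic": 6, "story": 5, "life": 4, "story-comp": 3,
    "life-rule.1": 2, "life-rule.2": 1, "questionable": 0,
}

def pick_culprit(conflicting_props: list) -> dict:
    """Among jointly inconsistent propositions, retract the least entrenched."""
    return min(conflicting_props, key=lambda p: KN_CAT_RANK[p["kn_cat"]])

props = [{"content": "smite(x,y,t) => dead(y,t)", "kn_cat": "life-rule.2"},
         {"content": "King Arthur drew Excalibur", "kn_cat": "story"}]
assert pick_culprit(props)["kn_cat"] == "life-rule.2"
```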

60
Belief Revision: 'smite'
  • Misunderstood word:
  • Initially believe that 'smite' means 'kill by hitting'
  • Read: 'King Lot smote down King Arthur'
  • Infer that King Arthur is dead
  • Then read: 'King Arthur drew his sword Excalibur'
  • Contradiction!
  • Weaken definition to 'hit and possibly kill'
  • Then read more passages in which smiting does not entail killing
  • Hypothesize that 'smite' means 'hit'

61
Belief Revision: 'smite'
  • Misunderstood word: 2-stage subtractive revision
  • Background knowledge includes:
  • (∗) smite(x,y,t) ⇒ hit(x,y,t) & dead(y,t) & cause(hit(x,y,t), dead(y,t))
  • P1: King Lot smote down King Arthur
  • D1: If person x smites person y at time t, then x hits y at t, and y is dead at t
  • Q1: What properties does King Arthur have?
  • R1: King Arthur is dead.
  • P2: King Arthur drew Excalibur.
  • Q2: When did King Arthur do this?
  • SNeBR is invoked:
  • KA's drawing E is inconsistent with his being dead
  • (∗) replaced by: smite(x,y,t) ⇒ hit(x,y,t) & [dead(y,t) ⇒ cause(hit, dead)]
  • D2: If person x smites person y at time t, then x hits y at t & possibly (y is dead at t)
  • P3: another passage in which ¬(smiting ⇒ death)
  • D3: If person x smites person y at time t, then x hits y at t

62
Belief Revision: 'dress'
  • Well-entrenched word:
  • Believe 'dress' means 'put clothes on'
  • Commonsense belief:
  • Spears don't wear clothing
  • …used in new sense:
  • Read: 'King Claudius dressed his spear'
  • Infer that spear wears clothing
  • Contradiction!
  • Modify definition to 'put clothes on OR something else'
  • Read: 'King Arthur dressed his troops before battle'
  • Infer that 'dress' means 'put clothes on OR prepare for battle'
  • Eventually: induce more general definition:
  • prepare (for the day, for battle, for eating…)

63
Belief Revision: 'dress'
  • Additive revision (see the sketch below):
  • Background info includes:
  • (1) dresses(x,y) ⇒ ∃z[clothing(z) & wears(y,z)]
  • (2) Spears don't wear clothing (both kn_cat = life-rule.1)
  • P1: King Arthur dressed himself.
  • D1: A person can dress itself; result: it wears clothing.
  • P2: King Claudius dressed his spear.
  • Cassie infers: King Claudius's spear wears clothing.
  • Q2: What wears clothing?
  • SNeBR is invoked:
  • 'KC's spear wears clothing' is inconsistent with (2).
  • (1) replaced by: dresses(x,y) ⇒ (∃z[clothing(z) & wears(y,z)] ∨ NEWDEF)
  • Replace (1), not (2), because of verb in antecedent of (1) (Gentner)
  • P3: other passages in which dressing spears precedes fighting
  • D2: A person can dress a spear or a person;
  • result: person wears clothing, or person is enabled to fight
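A sketch of this additive revision (assumed Python; rules are plain strings here for illustration, whereas SNePS represents them as network propositions):

```python
def additive_revision(rule: dict, placeholder: str = "NEWDEF") -> dict:
    """Weaken a rule's consequent: P => Q becomes P => (Q or NEWDEF)."""
    return {"if": rule["if"], "then": f"({rule['then']} or {placeholder})"}

def fill_newdef(rule: dict, disjunct: str) -> dict:
    """Replace NEWDEF once later co-texts suggest its content."""
    return {"if": rule["if"], "then": rule["then"].replace("NEWDEF", disjunct)}

dress = {"if": "dresses(x, y)", "then": "exists z: clothing(z) and wears(y, z)"}
dress = additive_revision(dress)                     # after the spear contradiction
dress = fill_newdef(dress, "enabled_to_fight(y)")    # hypothetical refinement
print(dress["then"])
# (exists z: clothing(z) and wears(y, z) or enabled_to_fight(y))
```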

64
CVA as Science & Detection
  • CVA = hypothesis generation & testing:
  • scientific task:
  • develop theory of word meaning
  • not 'guessing', but:
  • 'In science, guessing is called hypothesis formation' (Loui)
  • detective work:
  • finding clues
  • not 'who done it', but 'what does it mean'
  • susceptible to revision upon further evidence

65
2 Problematic Assumptions
  • CVA assumes that:
  • reader is consciously aware of the unfamiliar word
  • reader notes its unfamiliarity
  • CVA assumes that, between encounters:
  • reader remembers the word
  • reader remembers hypothesized meaning

66
I. Are All Contexts Created Equal?
  • Beck, Isabel L.; McKeown, Margaret G.; & McCaslin, Ellen S. (1983), 'Vocabulary Development: Not All Contexts Are Created Equal', Elementary School Journal 83(3): 177-181.
  • 'it is not true that every context is an appropriate or effective instructional means for vocabulary development'

67
Role of Prior Knowledge
  • Beck et al.:
  • co-text can give clues to the word's meaning
  • But 'clue' is relative:
  • clues need other info to be interpreted as clues
  • Implication A1:
  • textual clues need to be supplemented with other information to compute a meaning.
  • Supplemental info = reader's prior knowledge:
  • has to be available to reader
  • will be idiosyncratic

68
Do Words Have Unique, Correct Meanings?
  • Beck et al. (& others) assume:
  • A2: A word has a unique meaning
  • A3: A word has a correct meaning
  • Contra 'unique': A word's meaning varies with:
  • co-text
  • reader(s' prior knowledge)
  • time of reading
  • 'Correct' is a red herring (in any case, it's fishy):
  • Possibly, words have author-intended meanings,
  • but these need not be determined by co(n)text
  • Misunderstandings are universally unavoidable
  • Perfect understanding/dictionary definition not needed:
  • satisficing understanding for passage comprehension suffices
  • reader always has opportunity of revising definition hypothesis

69
Beck et al.'s Categories of Textual Contexts
  • What kinds of co-texts are helpful?
  • But keep in mind that we have different goals:
  • Beck et al.:
  • use co-text to teach correct word meanings
  • CCVA:
  • use context to compute word meaning for understanding

70
Beck et al.'s Textual Context Categories: Top-Level Kinds of Co-Text
  • Pedagogical co-texts:
  • artificially constructed, designed for teaching
  • only example is for a verb:
  • 'All the students made very good grades on the tests, so their teacher commended them for doing so well.'
  • Natural co-texts:
  • not intended to convey the meaning of a word
  • 4 kinds (actually, a continuum)

71
Beck et al.'s Textual Context Categories: 1. Misdirective (Natural) Co-Texts
  • 'seem to direct student to incorrect meaning for a word'
  • sole example:
  • 'Sandra had won the dance contest and the audience's cheers brought her to the stage for an encore. "Every step she takes is so perfect and graceful," Ginny said grudgingly, as she watched Sandra dance.'
  • grudgingly ≠ admiringly
  • Is this a natural context?
  • Is this all there is to it? …
  • A4: Co-texts have a fixed, usually small size
  • But larger co-text might add information
  • Prior knowledge can widen the co(n)text:
  • 'grudgingly' is an adverb!
  • A5: All words are equally easy to learn
  • But N easier than V, V easier than Adj/Adv! (Granger/Gentner/…Gleitman…)
  • A6: Only 1 co-text can be used
  • But later co-texts can assist in refining meaning

72
Beck et al.'s Textual Context Categories: 2. Nondirective (Natural) Co-Texts
  • 'of no assistance in directing the reader toward any particular meaning for a word'
  • sole example is for an adjective:
  • 'Dan heard the door open and wondered who had arrived. He couldn't make out the voices. Then he recognized the lumbering footsteps on the stairs and knew it was Aunt Grace.'
  • But:
  • Is it natural?
  • What about larger co-text?
  • An adjective!
  • Of no assistance? (see next slide)

73
Syntactic Manipulation
  • Do misdirective & nondirective contexts yield no (or only incorrect) information?
  • Cf. algebraic manipulation (brings x into focus):
  • 2x + 1 = 7
  • x = (7 − 1)/2 = 6/2 = 3
  • Syntactic manipulation (brings hard word into focus):
  • '"Every step she takes is so perfect and graceful," Ginny said grudgingly.'
  • ⇒ Grudgingly is the way that Ginny said …
  • So, grudgingly is a way of saying something
  • In particular, grudgingly is a way of (apparently) praising someone's performance
  • '…he recognized the lumbering footsteps on the stairs…'
  • ⇒ lumbering is a property of footsteps on stairs

74
Beck et al.'s Textual Context Categories: 3. General (Natural) Co-Texts
  • 'provide enough information for reader to place word in a general category'
  • sole example is for an adjective:
  • 'Joe and Stan arrived at the party at 7:00. By 9:30 the evening seemed to drag for Stan. But Joe really seemed to be having a good time at the party. "I wish I could be as gregarious as he is," thought Stan.'
  • Same problems, but:
  • adjective is contrasted with Stan's attitude
  • contrasts are good (so are parallel constructions)

75
Beck et al.'s Textual Context Categories: 4. Directive (Natural) Co-Texts
  • 'seem likely to lead the student to a specific, correct meaning for a word'
  • sole example is for a noun:
  • 'When the cat pounced on the dog, he leapt up, yelping, and knocked over a shelf of books. The animals ran past Wendy, tripping her. She cried out and fell to the floor. As the noise and confusion mounted, Mother hollered upstairs, "What's all the commotion?"'
  • Natural? Long!
  • Noun!
  • Note that the sole example of a directive context is a noun, suggesting that it might be the word that makes a context directive

76
Beck et al.'s Experiment
  • Ss given passages from basal readers
  • Researchers categorized co-texts & blacked out words
  • Ss asked to fill in the blanks with the missing words or reasonable synonyms
  • Results confirm 4 co-text types
  • Independently of results, there are methodological questions:
  • Are basal readers natural contexts?
  • How large were co-texts?
  • Instruction on how to do CVA?
  • A7: CVA comes naturally, so needs no training
  • A8: Fill-in-the-blank tasks are a form of CVA
  • No, they're not! (see next slide)

77
Beck et al.'s Experiment: CVA, Neologisms, & Fill-in-the-Blank
  • Serious methodological problem for all of us:
  • What if S knows the unknown word?
  • Filter out such Ss and words?
  • hard to do; what about testing familiar words?
  • Replace word with made-up neologism?
  • must be carefully chosen
  • Replace word with blank?
  • both kinds of replacement mislead S to find 'correct' missing/hidden word
  • ≠ CVA!
  • Our (imperfect) solution:
  • use plausible-sounding neologism
  • tell S it's like a foreign word with no English equivalent, hence need a descriptive phrase

78
Beck et al.'s Conclusion
  • 'less skilled readers receive little benefit from CVA'
  • A9: CVA can only help in learning correct meanings.
  • But:
  • CVA uses same techniques as general reading comprehension:
  • careful, slow reading
  • careful analysis of text
  • directed search for information useful for computing a meaning
  • application of relevant prior knowledge
  • application of reasoning for purpose of extracting information from text
  • ⇒ CVA, if properly taught & practiced, can improve general reading comprehension

79
II. Are Context Clues Unreliable Predictors of
Word Meanings?
  • Schatz, Elinor & Baldwin, R. Scott (1986), 'Context Clues Are Unreliable Predictors of Word Meanings', Reading Research Quarterly 21(4): 439-453.
  • 'context does not usually provide clues to the meanings of low-frequency words'
  • 'context clues inhibit the correct prediction of word meanings just as often as they facilitate them'

80
S&B's Argument
  • A10: CVA is not an efficient mechanism for inferring word meanings.
  • Because:
  • Co-text can't help you figure out the correct meaning of an unfamiliar word.
  • (assumptions A2 & A3 again!)
  • But:
  • Wide context can help you figure out a meaning for an unfamiliar word.
  • So, context (& CVA) are efficient mechanisms for inferring (better: computing) word meanings.

81
Incidental vs. Deliberate CVA
  • S&B:
  • 'context clues should help readers to infer meanings of words without the need for readers to interrupt the reading act(*) with diversions to external sources'
  • (*) true for incidental CVA,
  • (*) not for deliberate CVA
  • External sources are no solution anyway:
  • Dictionary definitions are just more co-texts! (Schwartz 1988)
  • CVA is base case of recursion, one of whose recursive clauses is 'Look it up in a dictionary'

82
  • Why not use a dictionary?
  • Because:
  • People are lazy (!)
  • Dictionaries are not always available
  • Dictionaries are always incomplete
  • Dictionary definitions are not always useful:
  • chaste =df clean, spotless ⇒? 'new dishes are chaste'
  • college =df a body of clergy living together and supported by a foundation
  • Most words are learned via incidental CVA,
  • not via dictionaries
  • Most importantly:
  • Dictionary definitions are just more contexts!

83
Why not use a dictionary?
  • Merriam-Webster New Collegiate Dictionary:
  • chaste:
  • innocent of unlawful sexual intercourse
  • student: stay away from that one!
  • celibate
  • student: huh?
  • pure in thought and act; modest
  • student: I have to find a sentence for that?
  • a: severely simple in design or execution; austere
  • student: huh? 'severely'? 'austere'?
  • b: clean, spotless
  • student: all right! 'The plates were still chaste after much use.'
  • (Deese 1967 / Miller 1985)

84
Why not use a dictionary?
  • Merriam-Webster (continued):
  • college:
  • a body of clergy living together and supported by a foundation
  • a building used for an educational or religious purpose
  • a: a self-governing constituent body of a university offering living quarters and instruction but not granting degrees
  • b: a preparatory or high school
  • c: an independent institution of higher learning offering a course of general studies leading to a bachelor's degree
  • Problem: ordering is historical!

85
Why not use a dictionary?
  • Merriam-Webster (continued):
  • infract: infringe
  • infringe: encroach
  • encroach:
  • to enter by gradual steps or by stealth into the possessions or rights of another
  • to advance beyond the usual or proper limits; trespass

86
Why not use a dictionary?
  • Collins COBUILD Dictionary:
  • 'Helping Learners with Real English'
  • chaste:
  • Someone who is chaste does not have sex with anyone, or only has sex with their husband or wife; an old-fashioned use, used showing approval. EG: She was a holy woman, innocent and chaste.
  • Something that is chaste is very simple in style, without much decoration. EG: chaste houses built in 1732

87
Why not use a dictionary?
  • Collins COBUILD Dictionary:
  • college:
  • A college is 1.1 an institution where students study for qualifications or do training courses after they have left school.
  • infract: not in dictionary
  • infringe:
  • If you infringe a law or an agreement, you break it.
  • encroach:
  • To encroach on or upon something means to slowly take possession or control of it, so that someone else loses it bit by bit.

88
S&B's Experiments
  • 25 natural passages from novels
  • words chosen (the only cited examples):
  • Adj/Adv: 67%
  • N: 27%
  • V: 6%
  • But:
  • what are the actual #s?
  • which lexical categories were hardest?
  • how do facilitative/confounding co-texts correlate with lexical category?
  • should have had representative sample of 4 co-text categories × 3 or 4 lexical categories

89
S&B's Experiments: CVA vs. Word-Sense Disambiguation
  • 2 experiments:
  • Ss chose 'correct' meaning from list of 5 possible meanings
  • This is WSD, not CVA!
  • WSD = multiple choice
  • CVA = essay question
  • 3rd experiment:
  • real CVA, but interested only in 'full denotative meanings' or 'accurate synonyms'
  • cf. assumption A3 about correct meanings!

90
S&B's Experiments: Space & Time Limits
  • Space limits on size of co-text?
  • S&B: 3 sentences
  • CCVA: start small, work outward
  • Time limits on size of co-text?
  • S&B: all students finished in allotted time
  • CCVA: no time limits

91
S&B's Experiments: Teaching CVA
  • S&B did not control for Ss' knowledge of how to use context clues
  • CCVA:
  • deliberate CVA is a skill
  • needs to be taught, modeled, & practiced
  • there is other (later) evidence that such training works
  • including critical-thinking education

92
S&B's 3 Questions (answered in the negative)
  • Do context clues occur with sufficient frequency to justify them as a major element of reading instruction?
  • Irrelevant if CVA fosters good reading comprehension & critical-thinking skills
  • Context clues do occur; teaching them is justified, if augmented by reader's prior knowledge & knowledge of CVA skills.
  • Does context usually provide accurate clues to denotations & connotations of low-frequency words?
  • Irrelevant under our conception of CVA: accuracy not needed
  • CVA can provide clues to revisable hypotheses about unfamiliar words' meaning
  • Are 'difficult' words in natural co-texts usually amenable to such analysis?
  • Such words are always amenable to yielding at least some information about their meaning.

93
A Computational Theory of CVA
  • A word does not have a unique meaning.
  • A word does not have a correct meaning.
  • Author's intended meaning for word doesn't need to be known by reader in order for reader to understand word in context
  • Even familiar/well-known words can acquire new meanings in new contexts.
  • Neologisms are usually learned only from context
  • Every co-text can give some clue to a meaning for a word.
  • Generate initial hypothesis via syntactic/'algebraic' manipulation
  • But co-text must be integrated with reader's prior knowledge
  • Large co-text & large PK ⇒ more clues
  • Lots of occurrences of word allow asymptotic approach to stable meaning hypothesis
  • CVA is computable
  • CVA is open-ended hypothesis generation.
  • CVA ≠ 'guess missing word' (cloze) & CVA ≠ word-sense disambiguation
  • Some words are easier to compute meanings for than others (N < V < Adj/Adv)
  • CVA can improve general reading comprehension (through active reasoning)
  • CVA can & should be taught in schools

94
State of the Art: Computational Linguistics
  • Information-extraction systems
  • Autonomous intelligent agents
  • There can be no complete lexicon
  • Such systems/agents shouldn't have to stop to ask questions

95
State of the Art: Computational Linguistics
  • Granger 1977: Foul-Up
  • Based on Schank's theory of scripts (schema theory)
  • Our system: not restricted to scripts
  • Zernik 1987: self-extending phrasal lexicon
  • Uses human informant
  • Our system is really self-extending
  • Hastings 1994: Camille
  • Maps unknown word to known concept in ontology
  • Our system can learn new concepts
  • Word-Sense Disambiguation:
  • Given ambiguous word & list of all meanings, determine the correct meaning
  • Multiple-choice test!
  • Our system: given new word, compute its meaning
  • Essay question!

96
State of the Art: Vocabulary Learning (I)
  • Elshout-Mohr / van Daalen-Kapteijns 1981, 1987:
  • Application of Winston's AI 'arch'-learning theory
  • (Good) reader's model of new word = frame:
  • Attribute slots, default values
  • Revision by updating slots & values
  • Poor readers update by replacing entire frames
  • But EM & vDK used:
  • Made-up words
  • Carefully constructed contexts
  • Presented in a specific order

97
Elshout-Mohr & van Daalen-Kapteijns
  • Experiments with neologisms in 5 artificial contexts:
  • 'When you are used to a view it is depressing when you live in a room with kolpers.'
  • (Superordinate information)
  • 'At home he had to work by artificial light because of those kolpers.'
  • 'During a heat wave, people want kolpers, so sun-blind sales increase.'
  • (Contexts showing 2 differences from the superordinate)
  • 'I was afraid the room might have kolpers, but plenty of sunlight came into it.'
  • 'This house has kolpers all summer until the leaves fall out.'
  • (Contexts showing 2 counterexamples due to the 2 differences)

98
State of the Art: Psychology
  • Johnson-Laird 1987:
  • Word understanding ≠ definition
  • Definitions aren't stored:
  • 'During the Renaissance, Bernini cast a bronze of a mastiff eating truffles.'

99
State of the Art: Psychology
  • Sternberg et al. 1983, 1987:
  • Cues to look for (= slots for frame):
  • Spatiotemporal cues
  • Value cues
  • Properties
  • Functions
  • Cause/enablement information
  • Class memberships
  • Synonyms/antonyms
  • To acquire new words from context:
  • Distinguish relevant/irrelevant information
  • Selectively combine relevant information
  • Compare this information with previous beliefs

100
Sternberg
  • 'The couple there on the blind date was not enjoying the festivities in the least. An acapnotic, he disliked her smoking; and when he removed his hat, she, who preferred "ageless" men, eyed his increasing phalacrosis and grimaced.'

101
From Algorithm to Curriculum
  • State of the art in vocabulary learning from context:
  • Mauser 1984: context = definition!
  • Clarke & Nation 1980: a 'strategy' (algorithm?):
  • Determine part of speech of word
  • Look at grammatical context:
  • Who does what to whom?
  • Look at surrounding textual context:
  • Search for clues (as we do)
  • 'Guess' the word & check your guess

102
CVA: From Algorithm to Curriculum
  • 'guess the word' =
  • 'then a miracle occurs'
  • Surely, computer scientists can be more explicit!
  • And so should teachers!

103
Terminology: 'Guessing'?
  • Does reader:
  • 'guess' a meaning?!
  • not computational!
  • 'deduce' a meaning?
  • too restrictive; ignores other kinds of inference
  • 'infer' a meaning?
  • too vague; ignores other kinds of reasoning (cf. Herbert Simon)
  • 'figure out' a meaning?
  • just vague enough?
  • My preference:
  • The reader computes a meaning!

104
A More Precise, Teachable Algorithm
  • Treat 'guess' as a procedure call (see the sketch below)
  • Fill in the details with our algorithm
  • Convert the algorithm into a curriculum
  • To enhance students' abilities to use deliberate CVA strategies
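Illustratively, 'treat guess as a procedure call' means the vague step becomes an explicit computation. This is an assumed Python sketch with toy stand-ins for operations named on earlier slides, not the project's code:

```python
def internalize(cotext: set, prior_knowledge: set) -> set:
    """Toy wide context: tagged prior beliefs plus internalized co-text."""
    return ({("from-pk", s) for s in prior_knowledge}
            | {("from-text", s) for s in cotext})

def initial_hypothesis(word: str) -> dict:
    return {"word": word, "clues": []}          # from syntactic manipulation

def search_for_slot_fillers(hyp: dict, context: set) -> dict:
    hyp["clues"] = sorted(s for tag, s in context if tag == "from-text")
    return hyp

def guess_meaning(word: str, cotext: set, prior_knowledge: set) -> dict:
    """The 'guess' step of Clarke & Nation's strategy, made explicit."""
    context = internalize(cotext, prior_knowledge)   # wide context
    return search_for_slot_fillers(initial_hypothesis(word), context)

print(guess_meaning("brachet",
                    {"a white ______ is next to the hart"},
                    {"only physical objects have color"}))
```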

105
From Algorithm to Curriculum (cont'd)
  • We have an explicit, GOF (symbolic) AI theory of how to do CVA
  • ⇒ Teachable!
  • Goal:
  • Not: teach people to think like computers
  • But: explicate computable & teachable methods to hypothesize word meanings from context
  • AI as computational psychology:
  • Devise computer programs that faithfully simulate (human) cognition
  • Can tell us something about (human) mind
  • Joint work with Michael Kibby (UB Reading Clinic):
  • 'We are teaching a machine, to see if what we learn in teaching it can help us teach students better'

106
CVA: From algorithm to curriculum
  • Treat 'guess' as a procedure call (subroutine)
  • Fill in the details with our algorithm
  • Convert the algorithm into a curriculum
  • To enhance students' abilities to use deliberate CVA strategies
  • To improve reading comprehension
  • … and back again!
  • Use knowledge gained from CVA case studies to improve the algorithm
  • I.e., use Cassie to learn how to teach humans,
  • & use humans to learn how to teach Cassie

107
Problem in Converting Algorithm into Curriculum
  • 'A knight picks up a brachet and carries it away'
  • Cassie:
  • Has perfect memory
  • Is a perfect reasoner
  • Automatically infers that the brachet is small
  • People don't always realize this
  • May need prompt