Slide 1: Translating Between Conceptual Systems
Robert Goldstone and Brian Rogosky
Indiana University, Department of Psychology, Program in Cognitive Science
Slide 2: How do concepts get their meaning?
- Conceptual web
  - A concept's meaning comes from its connections to other concepts in the same conceptual system
- External grounding
  - A concept's meaning comes from its connection to the external world
Slide 3: The Conceptual Web
- Philosophy
  - Conceptual role semantics (Block, 1986; Field, 1977)
  - Conceptual incommensurability (Kuhn, 1962)
- Psychology
  - Isolated and interrelated concepts (Goldstone, 1996)
  - Latent semantic analysis (Landauer & Dumais, 1997)
- Computer science
  - Semantic networks (Collins & Quillian, 1969)
  - Intrinsic meaning in large databases (Lenat & Feigenbaum, 1991)
Slide 4: The Conceptual Web in Linguistics
"Language is a system of interdependent terms in which the value of each term results solely from the simultaneous presence of other terms in the system. Concepts are purely differential and defined not in terms of their positive content, but negatively by their relations with other terms in the system." (Ferdinand de Saussure, 1915)
Contrast sets: spaghetti, linguine, fettuccine
Slide 5: Prototypical Face (Steyvers, 1999)
Slide 6: Particular Face
Slide 7: Caricature of a Particular Face, Away from the Prototype
Slide 9: Intrinsic Meaning in Large Databases
"The problem of genuine semantics... gets easier, not harder, as the Knowledge Base grows. In the case of an enormous KB, such as CYC's, for example, we could rename all of the frames and predicates as G001, G002, ..., and, using our knowledge of the world, reconstruct what each of their names must be." (Lenat & Feigenbaum, 1991, p. 236)
Slide 10: Externally Grounded Concepts
- Philosophy
  - The symbol grounding problem: "Suppose you had to learn Chinese as a first language and the only source of information you had was a Chinese/Chinese dictionary... How can you ever get off the symbol/symbol merry-go-round? How is symbol meaning to be grounded in something other than just more meaningless symbols? This is the symbol grounding problem." (Harnad, 1991)
- Psychology
  - Perceptual symbol systems (Barsalou, 1999)
- Computer science
  - Embodied cognition (Brooks, 1991)
Slide 11: Translation Across Conceptual Systems
- How can we determine that two people share a matching concept of something (such as Mushroom)?
- The publicity of concepts: we want to say that two people both have a concept of Mushroom even though they know different things (Fodor, 1998)
- Cross-person translation as a challenge to conceptual web accounts of meaning (Fodor & Lepore, 1992)
  - If a concept's meaning depends on its role in its system, and if two people have different systems, then they can't have the same meaning
Slide 12: Fodor's (1998) Argument Against Conceptual Web Accounts of Meaning
- One cannot salvage a conceptual web account by using similarity, rather than identity, of systems to translate concepts
- "The similarity of our GW concepts is thus some (presumably weighted) function of the number of propositions about him that we both believe... But the question now arises: what about the shared beliefs themselves; are they or aren't they literally shared? This poses a dilemma for the similarity theorist that is, as far as I can see, unavoidable. If he says that our agreed upon beliefs about GW are literally shared, then he hasn't managed to do what he promised, viz. introduce a notion of similarity of content that dispenses with a robust notion of publicity. But if he says that the agreed beliefs aren't literally identical (viz. that they are only required to be similar), then his account of content similarity begs the very question it was supposed to answer: his way of saying what it is for concepts to have similar, but not identical, contents presupposes a prior notion of beliefs with similar but not identical concepts." (Fodor, 1998)
Slide 13: The ABSURDIST Algorithm (Aligning Between Systems Using Relations Derived Inside Systems Themselves)
- Translation across systems is possible using only within-system relations
  - Two concepts can correspond to each other even if they are different
  - Contra Fodor, it is possible to go from similarities to matching concepts
- Purposefully impoverished conceptual representation
  - Concepts are defined only by their similarities to other concepts in the same system
  - A person's conceptual network is represented as a matrix of similarities
  - Not a realistic representation, but the most challenging case for a conceptual web account
- Within-system relations are sufficient for cross-system translation, but external grounding and internal relations increase each other's power
Slide 14: The Computational Task for ABSURDIST
- Input: two similarity matrices
- Output: a set of alignments between the matrices
  - One node for each possible translation between elements of the two systems
  - With processing, one consistent set of nodes will be activated
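This input/output arrangement can be sketched concretely. The sketch below is our illustration, not code from the talk: the 2-D points stand in for concepts (as in the simulations described later), and all variable names are assumptions.

```python
import numpy as np

# Minimal sketch of ABSURDIST's task interface (our names, our toy data).
n = 3
rng = np.random.default_rng(0)
points_a = rng.random((n, 2))            # system A: n concepts as 2-D points
points_b = points_a.copy()               # system B: a copy of system A

# Input: one within-system distance matrix per system.
dist_a = np.linalg.norm(points_a[:, None] - points_a[None, :], axis=-1)
dist_b = np.linalg.norm(points_b[:, None] - points_b[None, :], axis=-1)

# One correspondence node for each possible translation: n * n units.
units = np.full((n, n), 0.5)

# Output after relaxation: pairs whose unit exceeds a threshold (0.95 in
# the talk). Before any processing, no unit passes the threshold.
alignments = [(q, x) for q in range(n) for x in range(n) if units[q, x] > 0.95]
```

One node per pair of elements gives n^2 correspondence units; the network's job is to drive one consistent subset of them toward high activation.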
Slide 15: [Figure: System A with elements q, r, s; System B with elements x, y, z]
Objects q and x enter into similar similarity relations (distances) to the other objects within their systems.
Slide 16: [Figure: the same two systems, A (q, r, s) and B (x, y, z)]
q is aligned with x in part because q's similarity to r is similar to x's similarity to y. But doesn't this assume that r (and not s) corresponds to y? Both C(r,y) and C(s,y) facilitate C(q,x), but C(r,y) facilitates C(q,x) more because it is more active. All correspondences must develop simultaneously.
Slide 17:
C_t(A_q, B_x): unit that places object q from system A into correspondence with object x from system B
N(C_t(A_q, B_x)): net input to this correspondence unit
N(C_t(A_q, B_x)) = αE(A_q, B_x) + βR(A_q, B_x) − χI(A_q, B_x)
E is the external similarity between A_q and B_x, R is their internal similarity, and I is the inhibition against placing A_q and B_x into correspondence.
α + β + χ = 1; α = 0 for the first simulation.
Slide 18:
Internal excitation R arises when correspondences are consistent and supportive:
R(A_q, B_x) = (1 / (n − 1)) Σ_{r≠q} Σ_{y≠x} C_t(A_r, B_y) · S(D(A_q, A_r), D(B_x, B_y))
D(A_q, A_r): distance between elements q and r in system A
S(E, F): similarity between distances E and F
Internal inhibition I arises when correspondences are inconsistent (2-to-1 mappings): units C_t(A_q, B_y) with y ≠ x and C_t(A_r, B_x) with r ≠ q compete with C_t(A_q, B_x).
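The net input and activation update described on these two slides can be sketched in code. This is a hedged reconstruction: the talk does not fix the form of S, the weights, or the learning rate, so the choices below (S(E, F) = 1 − |E − F| on distances scaled to [0, 1], β = χ = 0.5, L = 0.1) are our illustrative assumptions.

```python
import numpy as np

def similarity_of_distances(d1, d2):
    # Our choice of S(E, F): larger when the two distances are more alike.
    # Distances are assumed to have been scaled into [0, 1].
    return 1.0 - abs(d1 - d2)

def net_input(C, dist_a, dist_b, E=None, alpha=0.0, beta=0.5, chi=0.5):
    """Net input N(C(Aq, Bx)) = alpha*E + beta*R - chi*I for every unit.

    alpha = 0 matches the first simulation (no external grounding); the
    weights and normalizations here are illustrative assumptions.
    """
    n = C.shape[0]
    scale = max(dist_a.max(), dist_b.max()) or 1.0
    da, db = dist_a / scale, dist_b / scale
    N = np.zeros_like(C)
    for q in range(n):
        for x in range(n):
            # R: excitation from the other units, weighted by how similar
            # D(Aq, Ar) is to D(Bx, By).
            R = sum(C[r, y] * similarity_of_distances(da[q, r], db[x, y])
                    for r in range(n) if r != q
                    for y in range(n) if y != x) / (n - 1)
            # I: inhibition from competing units in the same row or column
            # (would-be 2-to-1 mappings).
            I = (C[q, :].sum() + C[:, x].sum() - 2 * C[q, x]) / (2 * (n - 1))
            ext = 0.0 if E is None else E[q, x]
            N[q, x] = alpha * ext + beta * R - chi * I
    return N

def step(C, N, L=0.1):
    # Positive net input pushes activation toward 1, negative toward 0,
    # keeping every unit within [0, 1].
    return np.clip(C + L * np.where(N >= 0, N * (1 - C), N * C), 0.0, 1.0)
```

With identical systems and the correct units fully active, each correct unit receives maximal internal excitation and no inhibition, while every competing unit receives strictly less support.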
Slide 19: Testing ABSURDIST
- Create conceptual webs for two people
  - Create a set of N concepts in Person A
  - Each concept is a position in a two-dimensional space
    - The metric constraint is not required by the algorithm, but is useful for visualization
  - Copy these concepts to Person B
  - Add noise to B's concepts
- Measure ABSURDIST's ability to recover the true alignments
  - 1000 separate runs
  - Initialize correspondence units to 0.5
  - Pass activation for a set number of iterations
  - Any concepts connected by a unit with more than 0.95 activation are assumed to be aligned
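The test procedure above can be sketched end to end. This is a toy reconstruction under stated assumptions: the distance-similarity kernel, its width, the weights, the learning rate, and the iteration count are ours, chosen only to make a small zero-noise run converge; the talk's actual parameters are not given on the slides.

```python
import numpy as np

def absurdist(dist_a, dist_b, iterations=500, beta=0.5, chi=0.5, L=0.1):
    """Relax n*n correspondence units using only within-system distances.

    Sketch of the slide-19 procedure; parameter values are assumptions.
    """
    n = dist_a.shape[0]
    scale = max(dist_a.max(), dist_b.max())
    da, db = dist_a / scale, dist_b / scale
    # S[q, r, x, y]: similarity of D(Aq, Ar) to D(Bx, By); a sharp
    # exponential kernel, so near-identical distances dominate.
    S = np.exp(-np.abs(da[:, :, None, None] - db[None, None, :, :]) / 0.05)
    C = np.full((n, n), 0.5)               # initialize correspondence units
    for _ in range(iterations):
        N = np.empty_like(C)
        for q in range(n):
            for x in range(n):
                R = sum(C[r, y] * S[q, r, x, y]
                        for r in range(n) if r != q
                        for y in range(n) if y != x) / (n - 1)
                I = (C[q, :].sum() + C[:, x].sum() - 2 * C[q, x]) / (2 * (n - 1))
                N[q, x] = beta * R - chi * I
        C = np.clip(C + L * np.where(N >= 0, N * (1 - C), N * C), 0.0, 1.0)
    return C

# Slide-19 setup: N concepts as 2-D positions for A, copied to B plus noise.
rng = np.random.default_rng(42)
n, noise = 5, 0.0                          # zero noise for this demo run
a = rng.random((n, 2))
b = a + rng.normal(0.0, 1.0, a.shape) * noise
pairwise = lambda p: np.linalg.norm(p[:, None] - p[None, :], axis=-1)
C = absurdist(pairwise(a), pairwise(b))
recovered = np.argmax(C, axis=1)           # best B match for each A concept
```

With zero noise the copied system is internally identical to the original, so the relaxation should settle on the true one-to-one alignment; raising `noise` reproduces the degradation studied on the following slides.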
Slide 20: Probability of Recovering All Correct Correspondences
For moderate noise levels, finding translations improves as the number of elements per system increases.
Slide 21: [Histogram: frequency of each number of correct correspondences]
ABSURDIST tends to get either all correspondences correct, or none.
Slide 22: Probability of Recovering All Correct Correspondences
Performance does not improve much after 2000 iterations, irrespective of the number of elements per system.
Slide 23: Even if two objects enter into identical similarity relations, they can be correctly distinguished based on indirect similarities.
Slide 24: Integrating External and Internal Determinants of Meaning
[Figure: System A with elements q, r, s; System B with elements x, y, z]
Slide 25: Percent Correct Correspondences
Seeding one correspondence helps more than just that one correspondence. The more elements per system, the bigger the influence of seeding one correspondence, relative to what would be expected from promoting that one correspondence alone.
Slide 26: Integrating Extrinsic and Intrinsic Determinants of Meaning
The intrinsic meaning of A_q is its similarity to the other elements of A; its extrinsic meaning is its absolute coordinates.
[Figure panels: extrinsic only; intrinsic only; intrinsic + extrinsic]
Slide 27: Probability of Recovering All Correct Correspondences
Integrating intrinsic and extrinsic influences produces better translations than either method by itself.
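One way to realize the extrinsic term is to derive E(A_q, B_x) from the elements' absolute coordinates. The formulation below (an exponential fall-off with coordinate distance) is our assumption, meant only to show how an external term slots in beside the internal one.

```python
import numpy as np

# Our sketch of an extrinsic term: E(Aq, Bx) falls off with the distance
# between the elements' absolute (externally grounded) coordinates.
rng = np.random.default_rng(1)
a = rng.random((4, 2))   # absolute coordinates of A's elements
b = a.copy()             # B grounded at the same external locations
E = np.exp(-np.linalg.norm(a[:, None] - b[None, :], axis=-1))
# E then enters the net input alongside the internal term, e.g. as
# alpha * E + beta * R - chi * I, so external grounding and within-system
# relations can reinforce one another.
```

When the two systems are grounded at the same locations, each element's strongest extrinsic match is its own counterpart, which is exactly the signal that seeds the internal relaxation.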
Slide 28: Applications of ABSURDIST
- Translating best-matching parts of a system
- Object recognition
  - Within-object relations provide a strong constraint for aligning objects (Edelman, 1999)
- Analogical reasoning and similarity
  - Most models of analogy require highly structured, propositional representations (Eliasmith & Thagard, 2001; Falkenhainer et al., 1989; Holyoak & Thagard, 1989; Hummel & Holyoak, 1997)
  - ABSURDIST can be applied when similarities are known but structured representations are hard to find: pictures, words, etc.
- Translating across large databases (dictionaries, thesauri, etc.)
Slide 29: Translating Between Systems with Different Sizes
[Figure: a larger System A and a smaller System B]
A consistent subset of the larger system is mapped onto the smaller system.
Slide 30: Application to Object Recognition
[Figure: an object to be recognized aligned with a stored object]
Slide 31: Conclusions
- Conceptual web accounts of conceptual translation are not viciously circular
- Connecting concepts both to the world and to each other is an attractive option
  - These connections are mutually supportive, not antagonistic
- Within-system relational information may have surprisingly large influences