AVENUE/LETRAS: Learning-based MT Approaches for Languages with Limited Resources

1
AVENUE/LETRAS: Learning-based MT Approaches for
Languages with Limited Resources
  • Alon Lavie, Jaime Carbonell, Lori Levin,
  • Bob Frederking
  • Joint work with
  • Erik Peterson, Christian Monson, Ariadna
    Font-Llitjos, Alison Alvarez, Roberto Aranovich

2
Why Machine Translation for Languages with
Limited Resources?
  • We are in the age of information explosion
  • The internet / web / Google → anyone can get the
    information they want, anytime
  • But what about the text in all those other
    languages?
  • How do they read all this English stuff?
  • How do we read all the stuff that they put
    online?
  • MT for these languages would enable:
  • Better government access to native, indigenous, and
    minority communities
  • Better minority and native community
    participation in information-rich activities
    (health care, education, government) without
    giving up their languages.
  • Civilian and military applications (disaster
    relief)
  • Language preservation

3
AVENUE/LETRAS Funding
  • Started in 2000 with small amount of DARPA/TIDES
    funding (NICE)
  • AVENUE project funded by 5-year NSF ITR grant
    (2001-2006)
  • Follow-on LETRAS project funded by NSF HLC
    Program grant (2006-2009)
  • Collaboration funding sources
  • Mapudungun (MINEDUC, Chile)
  • Hebrew (ISF, Israel)
  • Brazilian Portuguese Native Langs. (Brazilian
    Gov.)
  • Inupiaq (NSF, Polar Programs)

4
CMU's AVENUE Approach
  • Elicitation: use bilingual native informants to
    create a small, high-quality, word-aligned
    bilingual corpus of translated phrases and
    sentences
  • Building Elicitation corpora from feature
    structures
  • Feature Detection and Navigation
  • Transfer-rule Learning: apply ML-based methods to
    automatically acquire syntactic transfer rules
    for translation between the two languages
  • Learn from major language to minor language
  • Translate from minor language to major language
  • XFER Decoder
  • XFER engine produces a lattice of possible
    transferred structures at all levels
  • Decoder searches and selects the best scoring
    combination
  • Rule Refinement: refine the acquired rules via a
    process of interaction with bilingual informants
  • Morphology Learning
  • Word and Phrase bilingual lexicon acquisition

5
AVENUE MT Approach
[MT pyramid diagram: Interlingua at the apex (reached via Semantic Analysis and Sentence Planning); Transfer Rules at the intermediate level (via Syntactic Parsing and Text Generation); Direct SMT / EBMT at the base. Source (e.g. Quechua) → Target (e.g. English). AVENUE: automate the learning of the transfer rules.]
6
AVENUE Architecture
[Architecture diagram: Elicitation (Elicitation Tool → Elicitation Corpus → Word-Aligned Parallel Corpus) feeds Rule Learning and Morphology; the Run-Time System takes INPUT TEXT through the Run-Time Transfer System and the Decoder, using Lexical Resources, and produces OUTPUT TEXT; Rule Refinement (Translation Correction Tool → Rule Refinement Module) feeds corrections back into the grammar.]
7
Transfer Rule Formalism
SL: the old man    TL: ha-ish ha-zaqen

NP::NP [DET ADJ N] -> [DET N DET ADJ]
(
 (X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2)
 ((X1 AGR) = 3-SING)
 ((X1 DEF) = DEF)
 ((X3 AGR) = 3-SING)
 ((X3 COUNT) = +)
 ((Y1 DEF) = DEF)
 ((Y3 DEF) = DEF)
 ((Y2 AGR) = 3-SING)
 ((Y2 GENDER) = (Y4 GENDER))
)
  • Type information
  • Part-of-speech/constituent information
  • Alignments
  • x-side constraints
  • y-side constraints
  • xy-constraints,
  • e.g. ((Y1 AGR) = (X1 AGR)); a data-structure sketch follows below

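As a rough illustration of the formalism above, the NP rule can be encoded as a small data structure. This is a minimal sketch with invented class and field names (not the AVENUE implementation's actual representation):

  # Illustrative encoding of the NP::NP rule above; names are hypothetical.
  from dataclasses import dataclass, field

  @dataclass
  class TransferRule:
      rule_type: str                    # e.g. "NP::NP"
      x_side: list                      # source constituent sequence
      y_side: list                      # target constituent sequence
      alignments: list                  # 1-based (source index, target index) pairs
      x_constraints: list = field(default_factory=list)
      y_constraints: list = field(default_factory=list)
      xy_constraints: list = field(default_factory=list)   # e.g. (("Y1","AGR"), ("X1","AGR"))

  np_rule = TransferRule(
      rule_type="NP::NP",
      x_side=["DET", "ADJ", "N"],
      y_side=["DET", "N", "DET", "ADJ"],
      alignments=[(1, 1), (1, 3), (2, 4), (3, 2)],
      x_constraints=[(("X1", "AGR"), "3-SING"), (("X1", "DEF"), "DEF"),
                     (("X3", "AGR"), "3-SING")],
      y_constraints=[(("Y1", "DEF"), "DEF"), (("Y3", "DEF"), "DEF"),
                     (("Y2", "AGR"), "3-SING"), (("Y2", "GENDER"), ("Y4", "GENDER"))],
  )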
8
Transfer Rule Formalism (II)
SL: the old man    TL: ha-ish ha-zaqen

NP::NP [DET ADJ N] -> [DET N DET ADJ]
(
 (X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2)
 ((X1 AGR) = 3-SING)
 ((X1 DEF) = DEF)
 ((X3 AGR) = 3-SING)
 ((X3 COUNT) = +)
 ((Y1 DEF) = DEF)
 ((Y3 DEF) = DEF)
 ((Y2 AGR) = 3-SING)
 ((Y2 GENDER) = (Y4 GENDER))
)
  • Value constraints
  • Agreement constraints

9
Transfer Rules → Transfer Trees
Example: Hindi "jIvana ke eka aXyAya" (lit. "life of (one) chapter") → English "a chapter of life"
Pattern: NP1 ke NP2 → NP2 of NP1

[Tree diagrams: the source NP [PP [NP [N jIvana]] [P ke]] [NP1 [Adj eka] [N aXyAya]] is mapped onto the target NP [NP1 [Adj one] [N chapter]] [PP [P of] [NP [N life]]]]

NP,12
NP::NP : [PP NP1] -> [NP1 PP]
( (X1::Y2) (X2::Y1)
  ((x2 lexwx) = 'kA') )

NP,13
NP::NP : [NP1] -> [NP1]
( (X1::Y1) )

PP,12
PP::PP : [NP Postp] -> [Prep NP]
( (X1::Y2) (X2::Y1) )
10
Rule Learning - Overview
  • Goal: acquire syntactic transfer rules
  • Use available knowledge from the source side
    (grammatical structure)
  • Three steps:
  • Flat Seed Generation: first guesses at transfer
    rules; flat syntactic structure
  • Compositionality Learning: use previously learned
    rules to learn hierarchical structure
  • Constraint Learning: refine rules by learning
    appropriate feature constraints

11
Flat Seed Rule Generation
Learning Example: NP    Eng: the big apple    Heb: ha-tapuax ha-gadol

Generated Seed Rule:
NP::NP [ART ADJ N] -> [ART N ART ADJ]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))
12
Compositionality Learning
Initial Flat Rules:

S::S [ART ADJ N V ART N] -> [ART N ART ADJ V P ART N]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2) (X4::Y5) (X5::Y7) (X6::Y8))

NP::NP [ART ADJ N] -> [ART N ART ADJ]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))

NP::NP [ART N] -> [ART N]
((X1::Y1) (X2::Y2))

Generated Compositional Rule:

S::S [NP V NP] -> [NP V P NP]
((X1::Y1) (X2::Y2) (X3::Y4))
13
Constraint Learning
Input Rules and their Example Sets:

S::S [NP V NP] -> [NP V P NP]            {ex1, ex12, ex17, ex26}
((X1::Y1) (X2::Y2) (X3::Y4))

NP::NP [ART ADJ N] -> [ART N ART ADJ]    {ex2, ex3, ex13}
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))

NP::NP [ART N] -> [ART N]                {ex4, ex5, ex6, ex8, ex10, ex11}
((X1::Y1) (X2::Y2))

Output Rules with Feature Constraints:

S::S [NP V NP] -> [NP V P NP]
((X1::Y1) (X2::Y2) (X3::Y4)
 ((X1 NUM) = (X2 NUM))
 ((Y1 NUM) = (Y2 NUM))
 ((X1 NUM) = (Y1 NUM)))
14
AVENUE Prototypes
  • General XFER framework under development for past
    three years
  • Prototype systems so far
  • German-to-English, Dutch-to-English
  • Chinese-to-English
  • Hindi-to-English
  • Hebrew-to-English
  • Portuguese-to-English
  • In progress or planned
  • Mapudungun-to-Spanish
  • Quechua-to-Spanish
  • Inupiaq-to-English
  • Native-Brazilian languages to Brazilian Portuguese

15
Mapudungun
  • Indigenous Language of Chile and Argentina
  • 1 Million Mapuche Speakers

16
Collaboration
Eliseo Cañulef, Rosendo Huisca, Hugo Carrasco,
Hector Painequeo, Flor Caniupil, Luis Caniupil
Huaiquiñir, Marcela Collio Calfunao, Cristian
Carrillan Anton, Salvador Cañulef
  • Mapuche Language Experts
  • Universidad de la Frontera (UFRO)
  • Instituto de Estudios Indígenas (IEI)
  • Institute for Indigenous Studies
  • Chilean Funding
  • Chilean Ministry of Education (Mineduc)
  • Bilingual and Multicultural Education Program

Carolina Huenchullan Arrúe, Claudio Millacura
Salas
17
Accomplishments
  • Corpora Collection
  • Spoken Corpus
  • Collected by Luis Caniupil Huaiquiñir
  • Medical Domain
  • 3 of 4 Mapudungun Dialects
  • 120 hours of Nguluche
  • 30 hours of Lafkenche
  • 20 hours of Pwenche
  • Transcribed in Mapudungun
  • Translated into Spanish
  • Written Corpus
  • 200,000 words
  • Bilingual Mapudungun Spanish
  • Historical and newspaper text

nmlch-nmjm1_x_0405_nmjm_00
M: <SPA> no pütokovilu kay ko
C: no, si me lo tomaba con agua
M: chumgechi pütokoki femuechi pütokon pu <Noise>
C: como se debe tomar, me lo tomé pués
nmlch-nmjm1_x_0406_nmlch_00
M: Chengewerkelafuymiürke
C: ¡Ya no estabas como gente entonces!
18
Accomplishments
  • Developed at UFRO
  • Bilingual Dictionary with Examples
  • 1,926 entries
  • Spelling Corrected Mapudungun Word List
  • 117,003 fully-inflected word forms
  • Segmented Word List
  • 15,120 forms
  • Stems translated into Spanish

19
Accomplishments
  • Developed at LTI using Mapudungun language
    resources from UFRO
  • Spelling Checker
  • Integrated into OpenOffice
  • Hand-built Morphological Analyzer
  • Prototype Machine Translation Systems
  • Rule-Based
  • Example-Based
  • Website LenguasAmerindias.org

20
Challenges for Hebrew MT
  • Paucity of existing language resources for Hebrew
  • No publicly available broad coverage
    morphological analyzer
  • No publicly available bilingual lexicons or
    dictionaries
  • No POS-tagged corpus or parse tree-bank corpus
    for Hebrew
  • No large Hebrew/English parallel corpus
  • Scenario well suited for CMU transfer-based MT
    framework for languages with limited resources

21
Hebrew-to-English MT Prototype
  • Initial prototype developed within a two month
    intensive effort
  • Accomplished
  • Adapted available morphological analyzer
  • Constructed a preliminary translation lexicon
  • Translated and aligned Elicitation Corpus
  • Learned XFER rules
  • Developed (small) manual XFER grammar as a point
    of comparison
  • System debugging and development
  • Evaluated performance on unseen test data using
    automatic evaluation metrics

22
(No Transcript)
23
Morphology Example
  • Input word: BWRH
  • Lattice positions: 0 1 2 3 4
  • Possible segmentations:
  • BWRH
  • B + WR + H
  • B + H + WRH

24
Morphology Example
  • Y0: ((SPANSTART 0) (SPANEND 4) (LEX BWRH) (POS N) (GEN F) (NUM S) (STATUS ABSOLUTE))
  • Y1: ((SPANSTART 0) (SPANEND 2) (LEX B) (POS PREP))
  • Y2: ((SPANSTART 1) (SPANEND 3) (LEX WR) (POS N) (GEN M) (NUM S) (STATUS ABSOLUTE))
  • Y3: ((SPANSTART 3) (SPANEND 4) (LEX LH) (POS POSS))
  • Y4: ((SPANSTART 0) (SPANEND 1) (LEX B) (POS PREP))
  • Y5: ((SPANSTART 1) (SPANEND 2) (LEX H) (POS DET))
  • Y6: ((SPANSTART 2) (SPANEND 4) (LEX WRH) (POS N) (GEN F) (NUM S))
  • Y7: ((SPANSTART 0) (SPANEND 4) (LEX BWRH) (POS LEX))

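The analyses above can be read as arcs in a lattice over character positions 0-4; a segmentation of the word is any sequence of arcs that covers the word without gaps or overlaps. A minimal sketch of this idea, assuming a simple (arc-tuple) representation rather than the analyzer's actual output format:

  # Each analysis is an arc (start, end, lexeme, POS) over positions 0..4.
  arcs = [
      (0, 4, "BWRH", "N"), (0, 2, "B", "PREP"), (1, 3, "WR", "N"),
      (3, 4, "LH", "POSS"), (0, 1, "B", "PREP"), (1, 2, "H", "DET"),
      (2, 4, "WRH", "N"), (0, 4, "BWRH", "LEX"),
  ]

  def segmentations(start=0, end=4):
      """Enumerate arc sequences that span [start, end) contiguously."""
      if start == end:
          return [[]]
      paths = []
      for arc in arcs:
          if arc[0] == start and arc[1] <= end:
              for rest in segmentations(arc[1], end):
                  paths.append([arc] + rest)
      return paths

  for path in segmentations():
      print(" + ".join(f"{lex}/{pos}" for _, _, lex, pos in path))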
25
Example Translation
  • Input
  • [Hebrew source sentence not preserved in this transcript]
  • After debates many decided the government to hold
    referendum in issue the withdrawal
  • Output
  • AFTER MANY DEBATES THE GOVERNMENT DECIDED TO HOLD
    A REFERENDUM ON THE ISSUE OF THE WITHDRAWAL

26
Sample Output (dev-data)
  • maxwell anurpung comes from ghana for israel four
    years ago and since worked in cleaning in hotels
    in eilat
  • a few weeks ago announced if management club
    hotel that for him to leave israel according to
    the government instructions and immigration
    police
  • in a letter in broken english which spread among
    the foreign workers thanks to them hotel for
    their hard work and announced that will purchase
    for hm flight tickets for their countries from
    their money

27
Challenges and Future Directions
  • Automatic Transfer Rule Learning
  • Learning mappings for non-compositional
    structures
  • Effective models for rule scoring, for:
  • Decoding using scores at runtime
  • Pruning the large collections of learned rules
  • Learning Unification Constraints
  • In the absence of morphology or POS annotated
    lexica
  • Integrated Xfer Engine and Decoder
  • Improved models for scoring tree-to-tree
    mappings, integration with LM and other knowledge
    sources in the course of the search

28
Challenges and Future Directions
  • Our approach for learning transfer rules is
    applicable to the large parallel data scenario,
    subject to solutions for several big challenges
  • No elicitation corpus → break down parallel
    sentences into reasonable learning examples
  • Working with less reliable automatic word
    alignments rather than manual alignments
  • Effective use of reliable parse structures for
    ONE language (i.e. English) and automatic word
    alignments in order to decompose the translation
    of a sentence into several compositional rules.
  • Effective scoring of resulting very large
    transfer grammars, and scaled up transfer
    decoding

29
Future Research Directions
  • Automatic Rule Refinement
  • Morphology Learning
  • Feature Detection and Corpus Navigation

30
Implications for MT with Vast Amounts of Parallel
Data
  • Phrase-to-phrase MT is ill suited for long-range
    reorderings → ungrammatical output
  • Recent work on hierarchical Stat-MT [Chiang,
    2005] and parsing-based MT [Melamed et al., 2005;
    Knight et al.]
  • Learning general tree-to-tree syntactic mappings
    is equally problematic
  • Meaning is a hybrid of complex, non-compositional
    phrases embedded within a syntactic structure
  • Some constituents can be translated in isolation,
    others require contextual mappings

31
Evaluation Results
  • Test set of 62 sentences from Haaretz newspaper,
    2 reference translations

System BLEU NIST Precision Recall METEOR
No Gram 0.0616 3.4109 0.4090 0.4427 0.3298
Learned 0.0774 3.5451 0.4189 0.4488 0.3478
Manual 0.1026 3.7789 0.4334 0.4474 0.3617
32
Hebrew-English Test Suite Evaluation
Grammar BLEU METEOR
Baseline (NoGram) 0.0996 0.4916
Learned Grammar 0.1608 0.5525
Manual Grammar 0.1642 0.5320
33
Quechua→Spanish MT
  • V-Unit-funded summer project in Cusco (Peru),
    June-August 2005; preparations and data
    collection started earlier
  • Intensive Quechua course in Centro Bartolome de
    las Casas (CBC)
  • Worked together with two native and one
    non-native Quechua speakers on developing
    infrastructure (correcting elicited translations,
    segmenting and translating a list of the most
    frequent words)

34
Quechua → Spanish Prototype MT System
  • Stem Lexicon (semi-automatically generated): 753
    lexical entries
  • Suffix lexicon: 21 suffixes
  • (150 Cusihuaman)
  • Quechua morphology analyzer
  • 25 translation rules
  • Spanish morphology generation module
  • User studies: 10 sentences, 3 users (2 native, 1
    non-native)

35
The Transfer Engine
Analysis: Source text is parsed into its grammatical structure, which determines the order in which transfer rules are applied. Example: [Chinese source, lit. "he read book"], parsed as S over an NP subject and a VP containing the verb and the object NP.
Transfer: A target-language tree is created by reordering, insertion, and deletion: "he read [DET N: a book]". The article "a" is inserted into the object NP; source words are translated with the transfer lexicon.
Generation: Target-language constraints are checked and the final translation is produced; e.g. "reads" is chosen over "read" to agree with "he".
Final translation: He reads a book
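A toy sketch of the transfer step described above: reorder source constituents according to a rule's alignments and insert target-only material such as the article. The rule format and function are invented for illustration; this is not the XFER engine's actual code.

  # Toy transfer: map a source constituent sequence onto the target sequence.
  def apply_transfer(rule, source_constituents, lexicon):
      """rule: (x_side, y_side, alignments); alignments are 1-based (x, y) pairs."""
      x_side, y_side, alignments = rule
      target = []
      for y_pos, y_label in enumerate(y_side, start=1):
          aligned = [x for (x, y) in alignments if y == y_pos]
          if aligned:
              word = source_constituents[aligned[0] - 1]
              target.append(lexicon.get(word, word))   # translate via transfer lexicon
          else:
              target.append(y_label)                   # target-only insertion, e.g. "a"
      return target

  # "he read book" -> "he read a book" (agreement is fixed later, in generation)
  rule = (["NP", "V", "NP"], ["NP", "V", "a", "NP"], [(1, 1), (2, 2), (3, 4)])
  print(apply_transfer(rule, ["he", "read", "book"], {}))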
36
The Transfer Engine
  • Some Unique Features
  • Works with either learned or manually-developed
    transfer grammars
  • Handles rules with or without unification
    constraints
  • Supports interfacing with servers for
    Morphological analysis and generation
  • Can handle ambiguous source-word analyses and/or
    SL segmentations represented in the form of
    lattice structures

37
The Lattice Decoder
  • Simple Stack Decoder, similar in principle to
    SMT/EBMT decoders
  • Searches for best-scoring path of non-overlapping
    lattice arcs
  • Scoring based on log-linear combination of
    scoring components (no MER training yet)
  • Scoring components
  • Standard trigram LM
  • Fragmentation: how many arcs are needed to cover
    the entire translation?
  • Length Penalty
  • Rule Scores (not fully integrated yet)

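A sketch of the log-linear path scoring described above, with made-up feature weights and a toy trigram LM; the fragmentation and length terms are simple stand-ins for the listed components, not the decoder's actual feature functions.

  import math

  # Hypothetical weights for the log-linear combination (not tuned; no MER training).
  WEIGHTS = {"lm": 1.0, "frag": -0.5, "length": -0.1}

  def lm_logprob(words, trigram_lm):
      """Sum trigram log-probabilities, backing off to a small floor probability."""
      padded = ["<s>", "<s>"] + words
      return sum(math.log(trigram_lm.get(tuple(padded[i - 2:i + 1]), 1e-6))
                 for i in range(2, len(padded)))

  def path_score(arcs, trigram_lm):
      """arcs: non-overlapping lattice arcs, each (source_span, target_words)."""
      words = [w for _, ws in arcs for w in ws]
      return (WEIGHTS["lm"] * lm_logprob(words, trigram_lm)
              + WEIGHTS["frag"] * len(arcs)        # fragmentation: fewer arcs is better
              + WEIGHTS["length"] * len(words))    # simple length penalty

  toy_lm = {("<s>", "<s>", "he"): 0.2, ("<s>", "he", "reads"): 0.1,
            ("he", "reads", "a"): 0.1, ("reads", "a", "book"): 0.2}
  paths = [[((0, 2), ["he", "reads"]), ((2, 4), ["a", "book"])],
           [((0, 1), ["he"]), ((1, 2), ["reads"]), ((2, 4), ["a", "book"])]]
  print(max(paths, key=lambda p: path_score(p, toy_lm)))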
38
Outline
  • Rationale for learning-based MT
  • Roadmap for learning-based MT
  • Framework overview
  • Elicitation
  • Learning transfer Rules
  • Automatic rule refinement
  • Example prototypes
  • Implications for MT with vast parallel data
  • Conclusions and future directions

39
Data Elicitation for Languages with Limited
Resources
  • Rationale
  • Large volumes of parallel text are not available →
    create a small, maximally-diverse parallel corpus
    that directly supports the learning task
  • Bilingual native informant(s) can translate and
    align a small pre-designed elicitation corpus,
    using elicitation tool
  • Elicitation corpus designed to be typologically
    and structurally comprehensive and compositional
  • Transfer-rule engine and new learning approach
    support acquisition of generalized transfer-rules
    from the data

40
Elicitation Tool: English-Chinese Example
41
Elicitation Tool: English-Chinese Example
42
Elicitation Tool: English-Hindi Example
43
Elicitation Tool: English-Arabic Example
44
Elicitation Tool: Spanish-Mapudungun Example
45
Designing Elicitation Corpora
  • What do we want to elicit?
  • Diversity of linguistic phenomena and
    constructions
  • Syntactic structural diversity
  • How do we construct an elicitation corpus?
  • Typological Elicitation Corpus: based on the
    elicitation and documentation work of field
    linguists (e.g. Comrie 1977, Bouquiaux 1992);
    initial corpus size 1000 examples
  • Structural Elicitation Corpus: based on a
    representative sample of English phrase
    structures; 120 examples
  • Organized compositionally: elicit simple
    structures first, then use them as building
    blocks
  • Goal: minimize size, maximize linguistic coverage

46
Typological Elicitation Corpus
  • Feature Detection
  • Discover what features exist in the language and
    where/how they are marked
  • Example: does the language mark gender of nouns?
    How and where is it marked?
  • Method: compare translations of minimal pairs
    (sentences that differ in only ONE feature); see
    the sketch below
  • Elicit translations/alignments for detected
    features and their combinations
  • Dynamic corpus navigation based on feature
    detection: no need to elicit combinations
    involving non-existent features

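A minimal sketch of the minimal-pair comparison idea: if the two translations of a pair that differs in exactly one feature value also differ, the language presumably marks that feature. The data and helper below are invented for illustration only.

  # Minimal pairs: same sentence except for ONE feature value (here: number).
  # The informant's translations are compared; a difference suggests the feature is marked.
  minimal_pairs = [
      {"feature": "number", "values": ("sg", "pl"),
       "source": ("The tree fell", "The trees fell"),
       "translation": ("feychi aliwen tranay", "feychi pu aliwen tranay")},  # invented data
  ]

  def detect_marked_features(pairs):
      marked = set()
      for p in pairs:
          if p["translation"][0] != p["translation"][1]:
              marked.add(p["feature"])   # translations differ -> feature is marked somewhere
      return marked

  print(detect_marked_features(minimal_pairs))   # {'number'} -> elicit number combinations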
47
Typological Elicitation Corpus
  • Initial typological corpus of about 1000
    sentences was manually constructed
  • New construction methodology for building an
    elicitation corpus, using:
  • A feature specification: lists the inventory of
    available features and their values
  • A definition of the set of desired feature
    structures
  • Schemas: define sets of desired combinations of
    features and values
  • A multiplier algorithm: generates the comprehensive
    set of feature structures (sketched below)
  • A generation grammar and lexicon: an NLG generator
    produces NL sentences from the feature structures

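A sketch of what the feature specification, a schema, and the multiplier step could look like; the specific feature names and the function are illustrative assumptions, not the project's actual formats.

  from itertools import product

  # Feature specification: inventory of features and their possible values (illustrative).
  feature_spec = {
      "np-number": ["sg", "pl"],
      "np-person": ["1", "2", "3"],
      "tense": ["past", "present", "future"],
  }

  # A schema names the features whose combinations we want covered.
  schema = ["np-number", "np-person", "tense"]

  def multiply(spec, schema):
      """Generate the comprehensive set of feature structures for a schema."""
      names = list(schema)
      for values in product(*(spec[n] for n in names)):
          yield dict(zip(names, values))

  feature_structures = list(multiply(feature_spec, schema))
  print(len(feature_structures))   # 2 * 3 * 3 = 18 feature structures
  print(feature_structures[0])     # e.g. {'np-number': 'sg', 'np-person': '1', 'tense': 'past'}
  # Each structure is then fed to the generation grammar/lexicon to produce an English sentence.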
48
Structural Elicitation Corpus
  • Goal: create a compact, diverse sample corpus of
    syntactic phrase structures in English, in order
    to elicit how these structures map into the
    elicited language
  • Methodology (sketched below):
  • Extracted all CFG rules from the Brown section of
    the Penn TreeBank (122K sentences)
  • Simplified the POS tag set
  • Constructed a frequency histogram of the extracted
    rules
  • Pulled out the simplest phrases for the most
    frequent rules for NPs, PPs, ADJPs, ADVPs, SBARs
    and Sentences
  • Some manual inspection and refinement
  • Resulting corpus of about 120 phrases/sentences
    representing common structures
  • See [Probst and Lavie, 2004]

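A sketch of the rule-extraction and frequency-histogram steps, assuming NLTK-style bracketed trees as input; the tiny example tree stands in for the actual Penn TreeBank data.

  from collections import Counter
  from nltk import Tree   # assumes NLTK is installed; any bracketed-tree reader would do

  # Placeholder for parsed treebank sentences (in practice: the Brown section of the Penn TreeBank).
  trees = [Tree.fromstring(
      "(S (NP (DT the) (NN cat)) (VP (VBD sat) (PP (IN on) (NP (DT the) (NN mat)))))")]

  rule_counts = Counter()
  for tree in trees:
      for prod in tree.productions():          # CFG rules, e.g. NP -> DT NN
          if prod.is_nonlexical():             # skip lexical rules like DT -> 'the'
              rule_counts[str(prod)] += 1

  # Frequency histogram of extracted rules; the most frequent NP/PP/S/... rules are then
  # paired with the simplest example phrases to form the ~120-item structural corpus.
  for rule, count in rule_counts.most_common(10):
      print(count, rule)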
49
Outline
  • Rationale for learning-based MT
  • Roadmap for learning-based MT
  • Framework overview
  • Elicitation
  • Learning transfer Rules
  • Automatic rule refinement
  • Example prototypes
  • Implications for MT with vast parallel data
  • Conclusions and future directions

50
Flat Seed Rule Generation
  • Create a flat transfer rule specific to the
    sentence pair, partially abstracted to POS
  • Words that are aligned word-to-word and have the
    same POS in both languages are generalized to
    their POS
  • Words that have complex alignments (or not the
    same POS) remain lexicalized
  • One seed rule for each translation example
  • No feature constraints are associated with seed
    rules (but each rule records the example(s) from
    which it was learned); a sketch follows below

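A sketch of the seed-rule construction just described, on a simplified, invented NP example with clean one-to-one alignments; the input format (tagged words plus 1-based alignment pairs) is an assumption for illustration.

  def flat_seed_rule(rule_type, src, tgt, alignments):
      """src/tgt: lists of (word, POS); alignments: 1-based (src_idx, tgt_idx) pairs."""
      src_links = {i: [a for a in alignments if a[0] == i] for i in range(1, len(src) + 1)}
      tgt_links = {j: [a for a in alignments if a[1] == j] for j in range(1, len(tgt) + 1)}

      def label(idx, words, links, other):
          word, pos = words[idx - 1]
          if len(links[idx]) == 1:                       # aligned word-to-word ...
              other_idx = links[idx][0][1 if words is src else 0]
              if other[other_idx - 1][1] == pos:         # ... with the same POS
                  return pos                             # -> generalize to POS
          return word                                    # otherwise stay lexicalized

      x_side = [label(i, src, src_links, tgt) for i in range(1, len(src) + 1)]
      y_side = [label(j, tgt, tgt_links, src) for j in range(1, len(tgt) + 1)]
      return (rule_type, x_side, y_side, alignments)     # no feature constraints on seed rules

  # Invented NP example with simple one-to-one alignments:
  eng = [("big", "ADJ"), ("apple", "N")]
  heb = [("tapuax", "N"), ("gadol", "ADJ")]
  print(flat_seed_rule("NP::NP", eng, heb, [(1, 2), (2, 1)]))
  # -> ('NP::NP', ['ADJ', 'N'], ['N', 'ADJ'], [(1, 2), (2, 1)])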
51
Compositionality Learning
  • Detection: traverse the c-structure of the
    English sentence, adding compositional structure
    for translatable chunks
  • Generalization: adjust constituent sequences and
    alignments (sketched below)
  • Two implemented variants:
  • Safe Compositionality: there exists a transfer
    rule that correctly translates the
    sub-constituent
  • Maximal Compositionality: generalize the rule if
    supported by the alignments, even in the absence
    of an existing transfer rule for the
    sub-constituent

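A sketch of the generalization step: a span of the flat rule that is covered by a lower-level chunk is replaced by that chunk's constituent label, and the alignments are remapped. The representation is a simplified assumption; applying it to the flat S rule from slide 12 (once per NP) reproduces the compositional rule shown there.

  def compose(flat_x, flat_y, alignments, chunk):
      """chunk: (label, x_span, y_span) with 1-based inclusive spans, e.g. ("NP", (1, 3), (1, 4))."""
      label, (xs, xe), (ys, ye) = chunk
      new_x = flat_x[:xs - 1] + [label] + flat_x[xe:]
      new_y = flat_y[:ys - 1] + [label] + flat_y[ye:]

      def remap(i, start, end):
          if i < start: return i
          if i > end:   return i - (end - start)
          return start                                    # indices inside the chunk collapse
      new_align = sorted({(remap(x, xs, xe), remap(y, ys, ye)) for x, y in alignments})
      return new_x, new_y, new_align

  # Flat S rule from slide 12; fold in the subject NP chunk, then the object NP chunk.
  flat_x = ["ART", "ADJ", "N", "V", "ART", "N"]
  flat_y = ["ART", "N", "ART", "ADJ", "V", "P", "ART", "N"]
  align = [(1, 1), (1, 3), (2, 4), (3, 2), (4, 5), (5, 7), (6, 8)]
  x1, y1, a1 = compose(flat_x, flat_y, align, ("NP", (1, 3), (1, 4)))
  print(compose(x1, y1, a1, ("NP", (3, 4), (4, 5))))
  # -> (['NP', 'V', 'NP'], ['NP', 'V', 'P', 'NP'], [(1, 1), (2, 2), (3, 4)])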
52
Constraint Learning
  • Goal: add appropriate feature constraints to the
    acquired rules
  • Methodology:
  • Preserve the general structural transfer
  • Learn specific feature constraints from the
    example set
  • Seed rules are grouped into clusters of similar
    transfer structure (type, constituent sequences,
    alignments)
  • Each cluster forms a version space: a partially
    ordered hypothesis space with a specific and a
    general boundary
  • The seed rules in a group form the specific
    boundary of a version space
  • The general boundary is the (implicit) transfer
    rule with the same type, constituent sequences,
    and alignments, but no feature constraints

53
Constraint Learning Generalization
  • The partial order of the version space:
  • Definition: a transfer rule tr1 is strictly more
    general than another transfer rule tr2 if all
    f-structures that are satisfied by tr2 are also
    satisfied by tr1
  • Generalize rules by merging them (sketched below):
  • Deletion of constraints
  • Raising two value constraints to an agreement
    constraint, e.g.
  • ((x1 num) = pl), ((x3 num) = pl) →
  • ((x1 num) = (x3 num))

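A sketch of the two merge operations named above, over rules whose constraints are stored as path-to-value dictionaries; the representation is an assumption for illustration, and the raising step here is applied only within the merged constraint set.

  def merge(constraints_a, constraints_b):
      """Generalize two seed rules' constraint sets (each a dict: path -> value)."""
      merged = {}
      agreements = []
      # 1) Deletion: keep only constraints shared (same path, same value) by both rules.
      for path, value in constraints_a.items():
          if constraints_b.get(path) == value:
              merged[path] = value
      # 2) Raising: two value constraints with the same value become an agreement constraint.
      paths = list(merged)
      for i, p in enumerate(paths):
          for q in paths[i + 1:]:
              if merged[p] == merged[q]:
                  agreements.append((p, q))      # e.g. ((x1 num), (x3 num))
      return merged, agreements

  rule1 = {("x1", "num"): "pl", ("x3", "num"): "pl", ("x1", "def"): "def"}
  rule2 = {("x1", "num"): "pl", ("x3", "num"): "pl", ("x1", "def"): "indef"}
  print(merge(rule1, rule2))
  # ({('x1','num'): 'pl', ('x3','num'): 'pl'}, [(('x1','num'), ('x3','num'))])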
54
Automated Rule Refinement
  • Bilingual informants can identify and pinpoint
    translation errors
  • A sophisticated trace of the translation path can
    identify likely sources for the error and perform
    Blame Assignment
  • Rule Refinement operators can be developed to
    modify the underlying translation grammar (and
    lexicon) based on characteristics of the error
    source (sketched below):
  • Add or delete feature constraints from a rule
  • Bifurcate a rule into two rules (general and
    specific)
  • Add or correct lexical entries
  • See [Font-Llitjós, Carbonell and Lavie, 2005]

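A sketch of what two of the refinement operators listed above could look like as functions over a simple rule representation; the operator and field names are illustrative, not the refinement module's actual interface.

  import copy

  def delete_constraint(rule, path):
      """Remove a feature constraint from a rule (rule['constraints'] is a dict path -> value)."""
      refined = copy.deepcopy(rule)
      refined["constraints"].pop(path, None)
      return refined

  def bifurcate(rule, path, value):
      """Split a rule into a general rule (without the constraint) and a specific one (with it)."""
      general = delete_constraint(rule, path)
      specific = copy.deepcopy(rule)
      specific["constraints"][path] = value
      return general, specific

  np_rule = {"type": "NP::NP", "constraints": {("x1", "def"): "def"}}
  general, specific = bifurcate(np_rule, ("x1", "def"), "def")
  print(general["constraints"], specific["constraints"])   # {} vs {('x1','def'): 'def'}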
55
Outline
  • Rationale for learning-based MT
  • Roadmap for learning-based MT
  • Framework overview
  • Elicitation
  • Learning transfer Rules
  • Automatic rule refinement
  • Example prototypes
  • Implications for MT with vast parallel data
  • Conclusions and future directions

56
Outline
  • Rationale for learning-based MT
  • Roadmap for learning-based MT
  • Framework overview
  • Elicitation
  • Learning transfer Rules
  • Automatic rule refinement
  • Learning Morphology
  • Example prototypes
  • Implications for MT with vast parallel data
  • Conclusions and future directions

57
Implications for MT with Vast Amounts of Parallel
Data
  • Example
  • [Chinese source sentence not preserved in this transcript]
  • He freq with J Zemin Pres via
    phone
  • He freq talked with President J Zemin over
    the phone

58
Implications for MT with Vast Amounts of Parallel
Data
  • Example
  • [Chinese source sentence not preserved in this transcript]
  • He freq with J Zemin Pres via
    phone
  • He freq talked with President J Zemin over
    the phone

[Diagram: NP1, NP2, and NP3 constituents aligned between the Chinese source and the English translation]
59
Conclusions
  • There is hope yet for wide-spread MT between many
    of the world's language pairs
  • MT offers a fertile yet extremely challenging
    ground for learning-based approaches that
    leverage diverse sources of information
  • Syntactic structure of one or both languages
  • Word-to-word correspondences
  • Decomposable units of translation
  • Statistical Language Models
  • AVENUE's XFER approach provides a feasible
    solution to MT for languages with limited
    resources
  • Promising approach for addressing the fundamental
    weaknesses in current corpus-based MT for
    languages with vast resources

60
(No Transcript)
61
Mapudungun-to-Spanish Example
English: I didn't see Maria
Mapudungun: pelafiñ Maria
Spanish: No vi a María
62
Mapudungun-to-Spanish Example
English: I didn't see Maria
Mapudungun: pelafiñ Maria
  pe    -la    -fi      -ñ                     Maria
  see   -neg   -3.obj   -1.subj.indicative     Maria
Spanish: No vi a María
  No    vi                              a      María
  neg   see.1.subj.past.indicative      acc    Maria
63
pe-la-fi-ñ Maria
[Tree: verb root pe under V]
64
pe-la-fi-ñ Maria
[Tree: verb root pe; suffix la (negation) under VSuff]
65
pe-la-fi-ñ Maria
[Tree: VSuff la projects to VSuffG; pass all features up]
66
pe-la-fi-ñ Maria
[Tree: suffix fi added under VSuff (object person 3)]
67
pe-la-fi-ñ Maria
[Tree: VSuffG groups la and fi; pass all features up from both children]
68
pe-la-fi-ñ Maria
[Tree: suffix ñ added under VSuff (person 1, number sg, mood ind)]
69
pe-la-fi-ñ Maria
[Tree: VSuffG groups the whole suffix sequence la-fi-ñ; pass all features up from both children]
70
pe-la-fi-ñ Maria
[Tree: V combines the root pe with the suffix group; pass all features up from both children; check that 1) negation is present and 2) tense is undefined]
71
pe-la-fi-ñ Maria
[Tree: NP with N Maria added (person 3, number sg, human)]
72
pe-la-fi-ñ Maria
[Tree: S built over the verb complex and the NP Maria; check that the NP is human; pass features up from the VP]
73
Transfer to Spanish Top-Down
[Tree: the Mapudungun S and VP nodes are paired with Spanish S and VP nodes for top-down transfer]
74
Transfer to Spanish Top-Down
[Tree: pass all features to the Spanish side]
75
Transfer to Spanish Top-Down
[Tree: pass all features down]
76
Transfer to Spanish Top-Down
[Tree: pass object features down]
77
Transfer to Spanish Top-Down
[Tree: the accusative marker a is introduced on the object NP because it is human]
78
Transfer to Spanish Top-Down
[Tree: the VP transfer rule that introduces the object marker a:]

VP::VP : [VBar NP] -> [VBar "a" NP]
(
 (X1::Y1) (X2::Y3)
 ((X2 type) = (NOT personal))
 ((X2 human) =c +)
 (X0 = X1)
 ((X0 object) = X2)
 (Y0 = X0)
 ((Y0 object) = (X0 object))
 (Y1 = Y0)
 (Y3 = (Y0 object))
 ((Y1 objmarker person) = (Y3 person))
 ((Y1 objmarker number) = (Y3 number))
 ((Y1 objmarker gender) = (Y3 gender))
)
79
Transfer to Spanish Top-Down
[Tree: pass person, number, and mood features to the Spanish verb; assign tense past]
80
Transfer to Spanish Top-Down
[Tree: Spanish no is introduced because of the negation]
81
Transfer to Spanish Top-Down
[Tree: the Spanish verb stem ver is selected for Mapudungun pe]
82
Transfer to Spanish Top-Down
[Tree: ver is inflected as vi (person 1, number sg, mood indicative, tense past)]
83
Transfer to Spanish Top-Down
[Tree: pass features over to the Spanish side; N Maria is realized as María]
84
I didn't see Maria
[Final trees: Mapudungun pe-la-fi-ñ Maria and Spanish No vi a María, fully built and aligned]
85
(No Transcript)