Learning-based MT Approaches for Languages with Limited Resources
Learning-based MT Approaches for Languages with Limited Resources Alon Lavie Language Technologies Institute Carnegie Mellon University Joint work with: Jaime Carbonell, Lori Levin, Kathrin Probst, Erik Peterson, Christian Monson, Ariadna Font-Llitjos, Alison Alvarez, Roberto Aranovich May 11, 2006 Learning-based MT with Limited Resources 1
Outline • Rationale for limited-resource learning-based MT • Roadmap for limited-resource learning-based MT • Framework overview • Elicitation • Learning transfer rules • Automatic rule refinement • Example prototypes • Implications for MT with vast parallel data • Conclusions and future directions May 11, 2006 Learning-based MT with Limited Resources 2
Why Machine Translation for Languages with Limited Resources? • We are in the age of information explosion – The internet + web + Google mean anyone can get the information they want anytime… • But what about the text in all those other languages? – How do they read all this English stuff? – How do we read all the stuff that they put online? • MT for these languages would enable: – Better government access to native, indigenous and minority communities – Better minority and native community participation in information-rich activities (health care, education, government) without giving up their languages – Civilian and military applications (disaster relief) – Language preservation May 11, 2006 Learning-based MT with Limited Resources 3
The Roadmap to Learning-based MT • Automatic acquisition of necessary language resources and knowledge using machine learning methodologies: – Learning morphology (analysis/generation) – Rapid acquisition of broad coverage word-to-word and phrase-to-phrase translation lexicons – Learning of syntactic structural mappings • Tree-to-tree and string-to-tree structure transformations [Knight et al], [Eisner], [Melamed] • Learning syntactic transfer rules with resources (grammar, parses) for just one of the two languages – Automatic rule refinement and/or post-editing • A framework for integrating the acquired MT resources into effective MT prototype systems • Effective integration of acquired knowledge with statistical/distributional information May 11, 2006 Learning-based MT with Limited Resources 4
CMU’s AVENUE Approach • Elicitation: use bilingual native informants to produce a small high-quality word-aligned bilingual corpus of translated phrases and sentences – Building Elicitation corpora from feature structures – Feature Detection and Navigation • Transfer-rule Learning: apply ML-based methods to automatically acquire syntactic transfer rules for translation between the two languages – Learn from major language to minor language – Translate from minor language to major language • XFER + Decoder: – XFER engine produces a lattice of possible transferred structures at all levels – Decoder searches and selects the best scoring combination • Rule Refinement: refine the acquired rules via a process of interaction with bilingual informants • Morphology Learning • Word and Phrase bilingual lexicon acquisition May 11, 2006 Learning-based MT with Limited Resources 5
AVENUE Architecture (diagram): Word-aligned elicited data → Learning Module → Transfer Rules (e.g. {PP,4894} ;; Score: 0.0470 PP::PP [NP POSTP] -> [PREP NP] ((X2::Y1) (X1::Y2))) → Run-Time Transfer System (with Translation Lexicon) → Lattice → Decoder (with English Language Model and Word-to-Word Translation Probabilities) May 11, 2006 Learning-based MT with Limited Resources 6
The Transfer Engine
Analysis: Source text is parsed into its grammatical structure; determines transfer application ordering. Example: 他 看 书。(he read book), parsed as S → NP(N 他) VP(V 看 NP(N 书))
Transfer: A target language tree is created by reordering, insertion, and deletion: S → NP(N he) VP(V read NP(DET a, N book))
Generation: Target language constraints are checked and the final translation is produced. E.g. “reads” is chosen over “read” to agree with “he”; article “a” is inserted into the object NP; source words are translated with the transfer lexicon. Final translation: “He reads a book”
May 11, 2006 Learning-based MT with Limited Resources 7
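To make the transfer step above concrete, here is a small illustrative sketch (not the actual transfer engine) of how a rule's constituent alignments drive reordering and insertion. The `apply_rule` helper, the toy lexicon, and the rule `S::S [NP V NP] -> [NP V "a" NP]` are all invented to mirror the slide's example; agreement-driven generation (choosing "reads" over "read") is glossed over by the lexicon lookup.

```python
def apply_rule(src_constituents, tgt_template, alignments, lexicon):
    """alignments: target position -> source position (1-based); unaligned
    target positions are literal insertions taken from the rule itself."""
    out = []
    for y, slot in enumerate(tgt_template, start=1):
        if y in alignments:
            out.append(lexicon[src_constituents[alignments[y] - 1]])
        else:
            out.append(slot)            # e.g. the inserted article "a"
    return out

# toy rule mirroring the slide: S::S [NP V NP] -> [NP V "a" NP]
src = ["他", "看", "书"]
lexicon = {"他": "he", "看": "reads", "书": "book"}   # generation/agreement glossed over
print(" ".join(apply_rule(src, ["NP", "V", "a", "NP"], {1: 1, 2: 2, 4: 3}, lexicon)))
# -> he reads a book
```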
The Transfer Engine • Some Unique Features: – Works with either learned or manually-developed transfer grammars – Handles rules with or without unification constraints – Supports interfacing with servers for morphological analysis and generation – Can handle ambiguous source-word analyses and/or SL segmentations represented in the form of lattice structures May 11, 2006 Learning-based MT with Limited Resources 8
The Lattice Decoder • Simple stack decoder, similar in principle to SMT/EBMT decoders • Searches for the best-scoring path of non-overlapping lattice arcs • Scoring based on a log-linear combination of scoring components (no MER training yet) • Scoring components: – Standard trigram LM – Fragmentation: how many arcs to cover the entire translation? – Length penalty – Rule scores (not fully integrated yet) May 11, 2006 Learning-based MT with Limited Resources 9
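A minimal sketch of the decoding idea above: dynamic programming over non-overlapping arcs that cover the source span, scored with a hand-weighted log-linear combination of a language-model stand-in, a fragmentation penalty, a length penalty, and rule scores. The `Arc` structure, the weights, and the toy LM are assumptions for illustration, not the AVENUE decoder's actual components.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Arc:
    start: int          # first source position covered (inclusive)
    end: int            # position after the last source position covered (exclusive)
    output: str         # candidate target-language string for the span
    rule_score: float   # score of the transfer rule that produced the arc

def lm_logprob(words: List[str]) -> float:
    """Stand-in for the trigram LM: simply prefers shorter output."""
    return -0.5 * len(words)

def decode(arcs: List[Arc], n_positions: int,
           w_lm=1.0, w_frag=-0.7, w_len=-0.1, w_rule=0.3) -> Tuple[float, List[Arc]]:
    """Best-scoring sequence of non-overlapping arcs covering positions 0..n_positions."""
    best = {0: (0.0, [])}                      # best[i] = (score, path) covering 0..i
    for i in range(1, n_positions + 1):
        for arc in arcs:
            if arc.end == i and arc.start in best:
                _, prev_path = best[arc.start]
                path = prev_path + [arc]
                words = " ".join(a.output for a in path).split()
                score = (w_lm * lm_logprob(words)          # language model
                         + w_frag * len(path)              # fragmentation: fewer arcs preferred
                         + w_len * len(words)              # length penalty
                         + w_rule * sum(a.rule_score for a in path))
                if i not in best or score > best[i][0]:
                    best[i] = (score, path)
    return best[n_positions]                   # KeyError if the span cannot be covered

arcs = [Arc(0, 2, "in line", 0.2),
        Arc(0, 4, "in the next line", 0.6),
        Arc(2, 4, "the next line", 0.4)]
score, path = decode(arcs, 4)
print(round(score, 2), [a.output for a in path])   # the single long arc wins here
```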
Outline • Rationale for learning-based MT • Roadmap for learning-based MT • Framework overview • Elicitation • Learning transfer rules • Automatic rule refinement • Example prototypes • Implications for MT with vast parallel data • Conclusions and future directions May 11, 2006 Learning-based MT with Limited Resources 10
Data Elicitation for Languages with Limited Resources • Rationale: – Large volumes of parallel text are not available → create a small maximally-diverse parallel corpus that directly supports the learning task – Bilingual native informant(s) can translate and align a small pre-designed elicitation corpus, using the elicitation tool – Elicitation corpus designed to be typologically and structurally comprehensive and compositional – Transfer-rule engine and new learning approach support acquisition of generalized transfer rules from the data May 11, 2006 Learning-based MT with Limited Resources 11
Elicitation Tool: English-Chinese Example May 11, 2006 Learning-based MT with Limited Resources 12
Elicitation Tool: English-Chinese Example May 11, 2006 Learning-based MT with Limited Resources 13
Elicitation Tool: English-Hindi Example May 11, 2006 Learning-based MT with Limited Resources 14
Elicitation Tool: English-Arabic Example May 11, 2006 Learning-based MT with Limited Resources 15
Elicitation Tool: Spanish-Mapudungun Example May 11, 2006 Learning-based MT with Limited Resources 16
Designing Elicitation Corpora • What do we want to elicit? – Diversity of linguistic phenomena and constructions – Syntactic structural diversity • How do we construct an elicitation corpus? – Typological Elicitation Corpus based on elicitation and documentation work of field linguists (e.g. Comrie 1977, Bouquiaux 1992): initial corpus size ~1000 examples – Structural Elicitation Corpus based on a representative sample of English phrase structures: ~120 examples • Organized compositionally: elicit simple structures first, then use them as building blocks • Goal: minimize size, maximize linguistic coverage May 11, 2006 Learning-based MT with Limited Resources 17
Typological Elicitation Corpus • Feature Detection – Discover what features exist in the language and where/how they are marked • Example: does the language mark gender of nouns? How and where are these marked? – Method: compare translations of minimal pairs – sentences that differ in only ONE feature • Elicit translations/alignments for detected features and their combinations • Dynamic corpus navigation based on feature detection: no need to elicit for combinations involving non-existent features May 11, 2006 Learning-based MT with Limited Resources 18
Typological Elicitation Corpus • Initial typological corpus of about 1000 sentences was manually constructed • New construction methodology for building an elicitation corpus using: – A feature specification: lists inventory of available features and their values – A definition of the set of desired feature structures • Schemas define sets of desired combinations of features and values • Multiplier algorithm generates the comprehensive set of feature structures – A generation grammar and lexicon: NLG generator generates NL sentences from the feature structures May 11, 2006 Learning-based MT with Limited Resources 19
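As a rough illustration of the multiplier step described above, the sketch below enumerates feature structures as the cross-product of the values of the features a schema names. The feature names and values in `feature_spec` are invented for illustration; the real corpus-construction pipeline (feature specification, schemas, and an NLG generator) is considerably richer.

```python
from itertools import product

# assumed feature specification: an inventory of features and their values
feature_spec = {
    "subj-person": ["1", "2", "3"],
    "subj-number": ["sg", "pl"],
    "tense": ["past", "present", "future"],
    "polarity": ["pos", "neg"],
}

def multiply(schema_features, spec):
    """Yield one flat feature structure (a dict) per combination of values."""
    values = [spec[f] for f in schema_features]
    for combo in product(*values):
        yield dict(zip(schema_features, combo))

# a schema names the subset of features to vary for one corpus section
schema = ["subj-person", "subj-number", "tense"]
structures = list(multiply(schema, feature_spec))
print(len(structures))   # 3 * 2 * 3 = 18 feature structures
print(structures[0])     # {'subj-person': '1', 'subj-number': 'sg', 'tense': 'past'}
```

Each resulting feature structure would then be handed to the generation grammar and lexicon to produce a natural-language sentence for translation by the informant.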
Structural Elicitation Corpus • Goal: create a compact, diverse sample corpus of syntactic phrase structures in English in order to elicit how these map into the elicited language • Methodology: – Extracted all CFG “rules” from the Brown section of the Penn Treebank (122K sentences) – Simplified POS tag set – Constructed frequency histogram of extracted rules – Pulled out simplest phrases for most frequent rules for NPs, PPs, ADJPs, ADVPs, SBARs and Sentences – Some manual inspection and refinement • Resulting corpus of about 120 phrases/sentences representing common structures • See [Probst and Lavie, 2004] May 11, 2006 Learning-based MT with Limited Resources 20
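The recipe above can be approximated with off-the-shelf tools. The sketch below uses the small WSJ sample of the Penn Treebank that ships with NLTK (a stand-in for the Brown section used in the actual work) to extract CFG-style rules, simplify their labels, and rank them by frequency; the `simplify` heuristic is my own assumption, not the tag-set simplification of [Probst and Lavie, 2004].

```python
from collections import Counter
import nltk
from nltk.corpus import treebank

nltk.download("treebank", quiet=True)

def simplify(label: str) -> str:
    # crude label simplification: drop function tags / indices, e.g. NP-SBJ-1 -> NP
    return label.split("-")[0].split("=")[0]

rule_counts = Counter()
for tree in treebank.parsed_sents():
    for prod in tree.productions():
        if prod.is_nonlexical():                  # skip POS -> word productions
            lhs = simplify(str(prod.lhs()))
            rhs = tuple(simplify(str(sym)) for sym in prod.rhs())
            rule_counts[(lhs, rhs)] += 1

# the most frequent NP expansions are natural candidates for the structural corpus
for (lhs, rhs), n in rule_counts.most_common(200):
    if lhs == "NP":
        print(n, lhs, "->", " ".join(rhs))
```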
Outline • Rationale for learning-based MT • Roadmap for learning-based MT • Framework overview • Elicitation • Learning transfer rules • Automatic rule refinement • Example prototypes • Implications for MT with vast parallel data • Conclusions and future directions May 11, 2006 Learning-based MT with Limited Resources 21
Transfer Rule Formalism
; SL: the old man, TL: ha-ish ha-zaqen
NP::NP [DET ADJ N] -> [DET N DET ADJ]            ; type and part-of-speech/constituent information
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2)             ; alignments
 ((X1 AGR) = *3-SING) ((X1 DEF) = *DEF)          ; x-side constraints
 ((X3 AGR) = *3-SING) ((X3 COUNT) = +)
 ((Y1 DEF) = *DEF) ((Y3 DEF) = *DEF)             ; y-side constraints
 ((Y2 AGR) = *3-SING) ((Y2 GENDER) = (Y4 GENDER)))
; xy-constraints, e.g. ((Y1 AGR) = (X1 AGR))
May 11, 2006 Learning-based MT with Limited Resources 22
Transfer Rule Formalism (II)
; SL: the old man, TL: ha-ish ha-zaqen
NP::NP [DET ADJ N] -> [DET N DET ADJ]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2)
 ; value constraints:
 ((X1 AGR) = *3-SING) ((X1 DEF) = *DEF) ((X3 AGR) = *3-SING) ((X3 COUNT) = +)
 ((Y1 DEF) = *DEF) ((Y3 DEF) = *DEF) ((Y2 AGR) = *3-SING)
 ; agreement constraints:
 ((Y2 GENDER) = (Y4 GENDER)))
May 11, 2006 Learning-based MT with Limited Resources 23
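For readers who want to manipulate such rules programmatically, here is one possible in-memory representation of a transfer rule, populated with the "the old man / ha-ish ha-zaqen" example from the slides. The field names are my own assumption, not the AVENUE system's internal representation; constraints are kept as opaque strings for brevity.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TransferRule:
    src_type: str                      # e.g. "NP"
    tgt_type: str                      # e.g. "NP"
    src_seq: List[str]                 # e.g. ["DET", "ADJ", "N"]
    tgt_seq: List[str]                 # e.g. ["DET", "N", "DET", "ADJ"]
    alignments: List[Tuple[int, int]]  # 1-based (x, y) constituent alignments
    constraints: List[str] = field(default_factory=list)  # kept symbolic here

old_man = TransferRule(
    "NP", "NP",
    ["DET", "ADJ", "N"], ["DET", "N", "DET", "ADJ"],
    [(1, 1), (1, 3), (2, 4), (3, 2)],
    ["((X1 AGR) = *3-SING)", "((Y1 DEF) = *DEF)", "((Y2 GENDER) = (Y4 GENDER))"],
)
print(old_man.src_seq, "->", old_man.tgt_seq)
```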
Rule Learning - Overview • Goal: Acquire Syntactic Transfer Rules • Use available knowledge from the source side (grammatical structure) • Three steps: 1. Flat Seed Generation: first guesses at transfer rules; flat syntactic structure 2. Compositionality Learning: use previously learned rules to learn hierarchical structure 3. Constraint Learning: refine rules by learning appropriate feature constraints May 11, 2006 Learning-based MT with Limited Resources 24
Flat Seed Rule Generation
Learning Example: NP
Eng: the big apple
Heb: ha-tapuax ha-gadol
Generated Seed Rule:
NP::NP [ART ADJ N] -> [ART N ART ADJ]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))
May 11, 2006 Learning-based MT with Limited Resources 25
Flat Seed Rule Generation • Create a “flat” transfer rule specific to the sentence pair, partially abstracted to POS – Words that are aligned word-to-word and have the same POS in both languages are generalized to their POS – Words that have complex alignments (or not the same POS) remain lexicalized • One seed rule for each translation example • No feature constraints associated with seed rules (but mark the example(s) from which it was learned) May 11, 2006 Learning-based MT with Limited Resources 26
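A toy sketch of the seed-generation step described above, under a simplifying assumption: a word is abstracted to its POS when every word it is aligned to on the other side carries the same POS, and is kept lexicalized otherwise. The data reproduce the "the big apple" example from the previous slide; `flat_seed` is an invented helper, not the project's implementation.

```python
def flat_seed(src, tgt, align):
    """src, tgt: lists of (word, POS); align: 0-based (src_idx, tgt_idx) word links."""
    src_links = {i: [] for i in range(len(src))}
    tgt_links = {j: [] for j in range(len(tgt))}
    for i, j in align:
        src_links[i].append(j)
        tgt_links[j].append(i)

    def label(tokens, other, links, idx):
        word, pos = tokens[idx]
        linked = links[idx]
        # abstract to POS only if the word is aligned and every word it links to
        # on the other side carries the same POS; otherwise keep it lexicalized
        if linked and all(other[k][1] == pos for k in linked):
            return pos
        return '"%s"' % word

    src_seq = [label(src, tgt, src_links, i) for i in range(len(src))]
    tgt_seq = [label(tgt, src, tgt_links, j) for j in range(len(tgt))]
    alignments = ["(X%d::Y%d)" % (i + 1, j + 1) for i, j in sorted(align)]
    return src_seq, tgt_seq, alignments

# "the big apple" <-> "ha-tapuax ha-gadol", hand-aligned as on the slide
src = [("the", "ART"), ("big", "ADJ"), ("apple", "N")]
tgt = [("ha", "ART"), ("tapuax", "N"), ("ha", "ART"), ("gadol", "ADJ")]
align = [(0, 0), (0, 2), (1, 3), (2, 1)]
print(flat_seed(src, tgt, align))
# (['ART', 'ADJ', 'N'], ['ART', 'N', 'ART', 'ADJ'],
#  ['(X1::Y1)', '(X1::Y3)', '(X2::Y4)', '(X3::Y2)'])
```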
Compositionality Learning
Initial Flat Rules:
S::S [ART ADJ N V ART N] -> [ART N ART ADJ V P ART N]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2) (X4::Y5) (X5::Y7) (X6::Y8))
NP::NP [ART ADJ N] -> [ART N ART ADJ]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))
NP::NP [ART N] -> [ART N]
((X1::Y1) (X2::Y2))
Generated Compositional Rule:
S::S [NP V NP] -> [NP V P NP]
((X1::Y1) (X2::Y2) (X3::Y4))
May 11, 2006 Learning-based MT with Limited Resources 27
Compositionality Learning • Detection: traverse the c-structure of the English sentence, add compositional structure for translatable chunks • Generalization: adjust constituent sequences and alignments • Two implemented variants: – Safe Compositionality: there exists a transfer rule that correctly translates the sub-constituent – Maximal Compositionality: Generalize the rule if supported by the alignments, even in the absence of an existing transfer rule for the sub-constituent May 11, 2006 Learning-based MT with Limited Resources 28
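The "safe" variant above can be caricatured as span substitution: if a learned NP rule already covers a contiguous sub-span on both sides, collapse that span into a single NP constituent. The sketch below does exactly that on the previous slide's example; it ignores alignment renumbering and all the bookkeeping the real learner needs.

```python
def compose(flat_src, flat_tgt, sub_src, sub_tgt, label="NP"):
    """Collapse one occurrence of (sub_src, sub_tgt) into a single constituent."""
    def find(seq, sub):
        for k in range(len(seq) - len(sub) + 1):
            if seq[k:k + len(sub)] == sub:
                return k
        return -1
    i, j = find(flat_src, sub_src), find(flat_tgt, sub_tgt)
    if i < 0 or j < 0:
        return flat_src, flat_tgt           # no safe composition possible
    return (flat_src[:i] + [label] + flat_src[i + len(sub_src):],
            flat_tgt[:j] + [label] + flat_tgt[j + len(sub_tgt):])

src = ["ART", "ADJ", "N", "V", "ART", "N"]
tgt = ["ART", "N", "ART", "ADJ", "V", "P", "ART", "N"]
src, tgt = compose(src, tgt, ["ART", "ADJ", "N"], ["ART", "N", "ART", "ADJ"])
src, tgt = compose(src, tgt, ["ART", "N"], ["ART", "N"])
print(src, tgt)   # ['NP', 'V', 'NP'] ['NP', 'V', 'P', 'NP']
```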
Constraint Learning
Input: Rules and their Example Sets
S::S [NP V NP] -> [NP V P NP]
((X1::Y1) (X2::Y2) (X3::Y4))                  {ex1, ex12, ex17, ex26}
NP::NP [ART ADJ N] -> [ART N ART ADJ]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))         {ex2, ex3, ex13}
NP::NP [ART N] -> [ART N]
((X1::Y1) (X2::Y2))                           {ex4, ex5, ex6, ex8, ex10, ex11}
Output: Rules with Feature Constraints:
S::S [NP V NP] -> [NP V P NP]
((X1::Y1) (X2::Y2) (X3::Y4)
 ((X1 NUM) = (X2 NUM))
 ((Y1 NUM) = (Y2 NUM))
 ((X1 NUM) = (Y1 NUM)))
May 11, 2006 Learning-based MT with Limited Resources 29
Constraint Learning • Goal: add appropriate feature constraints to the acquired rules • Methodology: – Preserve general structural transfer – Learn specific feature constraints from example set • Seed rules are grouped into clusters of similar transfer structure (type, constituent sequences, alignments) • Each cluster forms a version space: a partially ordered hypothesis space with a specific and a general boundary • The seed rules in a group form the specific boundary of a version space • The general boundary is the (implicit) transfer rule with the same type, constituent sequences, and alignments, but no feature constraints May 11, 2006 Learning-based MT with Limited Resources 30
Constraint Learning: Generalization • The partial order of the version space: Definition: A transfer rule tr1 is strictly more general than another transfer rule tr2 if all f-structures that are satisfied by tr2 are also satisfied by tr1. • Generalize rules by merging them: – Deletion of a constraint – Raising two value constraints to an agreement constraint, e.g. ((x1 num) = *pl), ((x3 num) = *pl) → ((x1 num) = (x3 num)) May 11, 2006 Learning-based MT with Limited Resources 31
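A small sketch of the second merging operator above: when two constituents carry value constraints on the same feature with the same value, the pair can be raised to a single agreement constraint. The tuple-based constraint encoding is my own simplification of the formalism.

```python
def raise_to_agreement(constraints):
    """constraints: list of (constituent, feature, value) value constraints."""
    out, used = [], set()
    items = list(constraints)
    for a in range(len(items)):
        for b in range(a + 1, len(items)):
            (c1, f1, v1), (c2, f2, v2) = items[a], items[b]
            if f1 == f2 and v1 == v2 and a not in used and b not in used:
                out.append(("agree", c1, c2, f1))   # ((c1 f1) = (c2 f1))
                used.update({a, b})
    out += [items[k] for k in range(len(items)) if k not in used]
    return out

rule_constraints = [("x1", "num", "*pl"), ("x3", "num", "*pl"), ("x1", "def", "*def")]
print(raise_to_agreement(rule_constraints))
# [('agree', 'x1', 'x3', 'num'), ('x1', 'def', '*def')]
```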
Automated Rule Refinement • Bilingual informants can identify translation errors and pinpoint the errors • A sophisticated trace of the translation path can identify likely sources for the error and do “Blame Assignment” • Rule Refinement operators can be developed to modify the underlying translation grammar (and lexicon) based on characteristics of the error source: – Add or delete feature constraints from a rule – Bifurcate a rule into two rules (general and specific) – Add or correct lexical entries • See [Font-Llitjos, Carbonell & Lavie, 2005] May 11, 2006 Learning-based MT with Limited Resources 32
Outline • Rationale for learning-based MT • Roadmap for learning-based MT • Framework overview • Elicitation • Learning transfer rules • Automatic rule refinement • Example prototypes • Implications for MT with vast parallel data • Conclusions and future directions May 11, 2006 Learning-based MT with Limited Resources 33
AVENUE Prototypes • General XFER framework under development for the past three years • Prototype systems so far: – German-to-English, Dutch-to-English – Chinese-to-English – Hindi-to-English – Hebrew-to-English • In progress or planned: – Mapudungun-to-Spanish – Quechua-to-Spanish – Arabic-to-English – Native-Brazilian languages to Brazilian Portuguese May 11, 2006 Learning-based MT with Limited Resources 34
Challenges for Hebrew MT • Paucity of existing language resources for Hebrew – No publicly available broad-coverage morphological analyzer – No publicly available bilingual lexicons or dictionaries – No POS-tagged corpus or parse tree-bank corpus for Hebrew – No large Hebrew/English parallel corpus • Scenario well suited for the CMU transfer-based MT framework for languages with limited resources May 11, 2006 Learning-based MT with Limited Resources 35
Hebrew-to-English MT Prototype • Initial prototype developed within a two-month intensive effort • Accomplished: – Adapted available morphological analyzer – Constructed a preliminary translation lexicon – Translated and aligned Elicitation Corpus – Learned XFER rules – Developed (small) manual XFER grammar as a point of comparison – System debugging and development – Evaluated performance on unseen test data using automatic evaluation metrics May 11, 2006 Learning-based MT with Limited Resources 36
Hebrew-to-English run-time example (pipeline: Preprocessing → Morphology → Transfer Engine → Translation Output Lattice → Decoder, with the Translation Lexicon and English Language Model as resources)
Source Input: בשורה הבאה
Transfer Rules: {NP1,3} NP1::NP1 [NP1 "H" ADJ] -> [ADJ NP1] ((X3::Y1) (X1::Y2) ((X1 def) = +) ((X1 status) =c absolute) ((X1 num) = (X3 num)) ((X1 gen) = (X3 gen)) (X0 = X1))
Translation Lexicon: N::N |: ["$WR"] -> ["BULL"] ((X1::Y1) ((X0 NUM) = s) ((Y0 lex) = "BULL")) ; N::N |: ["$WRH"] -> ["LINE"] ((X1::Y1) ((X0 NUM) = s) ((Y0 lex) = "LINE"))
Translation Output Lattice: (0 1 "IN" @PREP) (1 1 "THE" @DET) (2 2 "LINE" @N) (1 2 "THE LINE" @NP) (0 2 "IN LINE" @PP) (0 4 "IN THE NEXT LINE" @PP)
English Output: in the next line
May 11, 2006 Learning-based MT with Limited Resources 37
Morphology Example • Input word: B$WRH • Possible segmentations over character positions 0 1 2 3 4:
|----B$WRH----|
|-----B-----|$WR|--H--|
|--B--|-H--|--$WRH---|
May 11, 2006 Learning-based MT with Limited Resources 38
Morphology Example
Y0: ((SPANSTART 0) (SPANEND 4) (LEX B$WRH) (POS N) (GEN F) (NUM S) (STATUS ABSOLUTE))
Y1: ((SPANSTART 0) (SPANEND 2) (LEX B) (POS PREP))
Y2: ((SPANSTART 1) (SPANEND 3) (LEX $WR) (POS N) (GEN M) (NUM S) (STATUS ABSOLUTE))
Y3: ((SPANSTART 3) (SPANEND 4) (LEX $LH) (POS POSS))
Y4: ((SPANSTART 0) (SPANEND 1) (LEX B) (POS PREP))
Y5: ((SPANSTART 1) (SPANEND 2) (LEX H) (POS DET))
Y6: ((SPANSTART 2) (SPANEND 4) (LEX $WRH) (POS N) (GEN F) (NUM S) (STATUS ABSOLUTE))
Y7: ((SPANSTART 0) (SPANEND 4) (LEX B$WRH) (POS LEX))
May 11, 2006 Learning-based MT with Limited Resources 39
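Treated as data, the analyses above form a lattice over character positions 0..4 of B$WRH: any gap-free sequence of arcs from 0 to 4 is a possible segmentation, and the transfer engine works over all such paths. The enumeration below is illustrative only, not the system's lattice machinery.

```python
arcs = [  # (start, end, lexeme, POS), copied from the Y0..Y7 analyses above
    (0, 4, "B$WRH", "N"), (0, 2, "B", "PREP"), (1, 3, "$WR", "N"),
    (3, 4, "$LH", "POSS"), (0, 1, "B", "PREP"), (1, 2, "H", "DET"),
    (2, 4, "$WRH", "N"), (0, 4, "B$WRH", "LEX"),
]

def segmentations(pos, end, path=()):
    """Yield every gap-free sequence of arcs spanning positions pos..end."""
    if pos == end:
        yield path
        return
    for arc in arcs:
        if arc[0] == pos:
            yield from segmentations(arc[1], end, path + (arc,))

for seg in segmentations(0, 4):
    print(" + ".join("%s/%s" % (lex, pos) for _, _, lex, pos in seg))
# e.g. B/PREP + H/DET + $WRH/N, B/PREP + $WR/N + $LH/POSS, B$WRH/N, ...
```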
Sample Output (dev-data) maxwell anurpung comes from ghana for israel four years ago and since worked in cleaning in hotels in eilat a few weeks ago announced if management club hotel that for him to leave israel according to the government instructions and immigration police in a letter in broken english which spread among the foreign workers thanks to them hotel for their hard work and announced that will purchase for hm flight tickets for their countries from their money May 11, 2006 Learning-based MT with Limited Resources 40
Evaluation Results • Test set of 62 sentences from Haaretz newspaper, 2 reference translations
System    BLEU     NIST     P        R        METEOR
No Gram   0.0616   3.4109   0.4090   0.4427   0.3298
Learned   0.0774   3.5451   0.4189   0.4488   0.3478
Manual    0.1026   3.7789   0.4334   0.4474   0.3617
May 11, 2006 Learning-based MT with Limited Resources 41
Hebrew-English: Test Suite Evaluation
Grammar               BLEU     METEOR
Baseline (No Gram)    0.0996   0.4916
Learned Grammar       0.1608   0.5525
Manual Grammar        0.1642   0.5320
May 11, 2006 Learning-based MT with Limited Resources 42
Outline • Rationale for learning-based MT • Roadmap for learning-based MT • Framework overview • Elicitation • Learning transfer rules • Automatic rule refinement • Learning morphology • Example prototypes • Implications for MT with vast parallel data • Conclusions and future directions May 11, 2006 Learning-based MT with Limited Resources 43
Implications for MT with Vast Amounts of Parallel Data • Learning word/short-phrase translations vs. learning long phrase-to-phrase translations • Phrase-to-phrase MT is ill suited for long-range reorderings → ungrammatical output • Recent work on hierarchical Stat-MT [Chiang, 2005] and parsing-based MT [Melamed et al, 2005] • Learning general tree-to-tree syntactic mappings is equally problematic: – Meaning is a hybrid of complex, non-compositional phrases embedded within a syntactic structure – Some constituents can be translated in isolation, others require contextual mappings May 11, 2006 Learning-based MT with Limited Resources 44
Implications for MT with Vast Amounts of Parallel Data • Our approach for learning transfer rules is applicable to the large-data scenario, subject to solutions for several challenges: – No elicitation corpus → break down parallel sentences into reasonable learning examples – Working with less reliable automatic word alignments rather than manual alignments – Effective use of reliable parse structures for ONE language (i.e. English) and automatic word alignments in order to decompose the translation of a sentence into several compositional rules (see the sketch below) – Effective scoring of the resulting very large transfer grammars, and scaled-up transfer + decoding May 11, 2006 Learning-based MT with Limited Resources 45
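One standard way to attack the decomposition challenge referenced above is an alignment-consistency check on constituent spans: a constituent from the English parse can be peeled off as a separate learning example if no word inside its aligned target span links back to a source word outside the constituent. The sketch below applies that check to toy data; the parse spans and alignment links are invented, and this is a generic heuristic, not the project's actual procedure.

```python
def consistent(src_span, align):
    """Return the aligned target span if the source span is alignment-consistent."""
    s0, s1 = src_span
    tgt = [j for i, j in align if s0 <= i < s1]
    if not tgt:
        return None
    t0, t1 = min(tgt), max(tgt) + 1
    # reject if any target word inside [t0, t1) links to a source word outside the span
    if any(t0 <= j < t1 and not (s0 <= i < s1) for i, j in align):
        return None
    return (t0, t1)

# toy data: constituent spans from an (assumed) English parse, 0-based token indices
constituents = {"NP1": (0, 1), "VP": (1, 7), "NP2": (3, 5), "PP": (5, 7)}
align = [(0, 0), (1, 1), (2, 4), (3, 5), (4, 3), (5, 6), (6, 6)]
for label, span in constituents.items():
    print(label, span, "->", consistent(span, align))
# NP2 is rejected: a word inside its target span links back outside the constituent
```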
Implications for MT with Vast Amounts of Parallel Data • Example: 他 经常 与 江泽民 总统 通 电话 / gloss: He freq with J Zemin Pres via phone / He freq talked with President J Zemin over the phone May 11, 2006 Learning-based MT with Limited Resources 46
Implications for MT with Vast Amounts of Parallel Data • Example: [他]NP1 经常 与 [江泽民 总统]NP2 通 [电话]NP3 / gloss: [He]NP1 freq with [J Zemin Pres]NP2 via [phone]NP3 / [He]NP1 freq talked with [President J Zemin]NP2 over the [phone]NP3 May 11, 2006 Learning-based MT with Limited Resources 47
Conclusions • There is hope yet for wide-spread MT between many of the world's language pairs • MT offers a fertile yet extremely challenging ground for learning-based approaches that leverage diverse sources of information: – Syntactic structure of one or both languages – Word-to-word correspondences – Decomposable units of translation – Statistical language models • Provides a feasible solution to MT for languages with limited resources • Extremely promising approach for addressing the fundamental weaknesses in current corpus-based MT for languages with vast resources May 11, 2006 Learning-based MT with Limited Resources 48
Future Research Directions • Automatic Transfer Rule Learning: – In the “large-data” scenario: from large volumes of uncontrolled parallel text automatically word-aligned – In the absence of morphology or POS annotated lexica – Learning mappings for non-compositional structures – Effective models for rule scoring for • Decoding: using scores at runtime • Pruning the large collections of learned rules – Learning Unification Constraints • Integrated Xfer Engine and Decoder – Improved models for scoring tree-to-tree mappings, integration with LM and other knowledge sources in the course of the search May 11, 2006 Learning-based MT with Limited Resources 49
Future Research Directions • Automatic Rule Refinement • Morphology Learning • Feature Detection and Corpus Navigation • … May 11, 2006 Learning-based MT with Limited Resources 50
May 11, 2006 Learning-based MT with Limited Resources 51
Mapudungun-to-Spanish Example • English: I didn't see Maria • Mapudungun: pelafiñ Maria • Spanish: No vi a María May 11, 2006 Learning-based MT with Limited Resources 52
Mapudungun-to-Spanish Example
English: I didn't see Maria
Mapudungun: pelafiñ Maria
  pe -la -fi -ñ Maria
  see -neg -3.obj -1.subj.indicative Maria
Spanish: No vi a María
  neg see.1.subj.past.indicative acc Maria
May 11, 2006 Learning-based MT with Limited Resources 53
pe-la-fi-ñ Maria — bottom-up analysis (tree diagrams in the original slides):
• V → pe
• VSuff → la (Negation = +)
• VSuffG → VSuff; pass all features up
• VSuff → fi (object person = 3)
• VSuffG → VSuffG VSuff; pass all features up from both children
• VSuff → ñ (person = 1, number = sg, mood = ind)
• VSuffG → VSuffG VSuff; pass all features up from both children
• V → V VSuffG; pass all features up from both children; check that: 1) negation = +, 2) tense is undefined
• NP → N (Maria) (person = 3, number = sg, human = +)
• VP → V NP; S → VP; pass features up from V; check that NP is human = +
May 11, 2006 Learning-based MT with Limited Resources 54–63
Transfer to Spanish: Top-Down (tree diagrams in the original slides):
• The Mapudungun S / VP / NP / V structure is mapped to a Spanish S / VP / NP / V structure; pass all features to the Spanish side, then pass all features down
• Pass object features down
• Accusative marker "a" on objects is introduced because human = +
• VP::VP [VBar NP] -> [VBar "a" NP]
  ((X1::Y1) (X2::Y3)
   ((X2 type) = (*NOT* personal))
   ((X2 human) =c +)
   (X0 = X1) ((X0 object) = X2)
   (Y0 = X0) ((Y0 object) = (X0 object))
   (Y1 = Y0) (Y3 = (Y0 object))
   ((Y1 objmarker person) = (Y3 person))
   ((Y1 objmarker number) = (Y3 number))
   ((Y1 objmarker gender) = (Y3 gender)))
• Pass person, number, and mood features to the Spanish verb; assign tense = past
• "no" is introduced because negation = +
• The verb translates as "ver"; inflected form "vi" (person = 1, number = sg, mood = indicative, tense = past)
• Pass features over to the Spanish side; NP → N (María)
May 11, 2006 Learning-based MT with Limited Resources 64–74
I Didn’t see Maria — the completed Mapudungun analysis tree for "pe-la-fi-ñ Maria" and its Spanish transfer tree for "No vi a María" shown side by side (tree diagram in the original slide) May 11, 2006 Learning-based MT with Limited Resources 75
May 11, 2006 Learning-based MT with Limited Resources 76