CPSC 503 Computational Linguistics Lecture 10 Giuseppe Carenini


Lexical Dependencies: Problem
• Two parse trees for the sentence "Moscow sent troops into Afghanistan": one with VP-attachment of the PP "into Afghanistan" and one with NP-attachment (labelled (a) and (b) in the figure).
• Typically NP-attachment is more frequent than VP-attachment, so a plain PCFG prefers it even when VP-attachment is the correct reading.

Lexical Dependencies: Solution
• Add lexical dependencies to the scheme…
  – Infiltrate the influence of particular words into the probabilities in the derivation
  – i.e., condition on the actual words in the right way. All the words?
    (a) P(VP -> V NP PP | VP = "sent troops into Afg.")
    (b) P(VP -> V NP | VP = "sent troops into Afg.")

Example (right) (Collins 1999)
• Attribute grammar: each non-terminal is annotated with its lexical head… many more rules!

More specific rules
• We used to have rule r
  – VP -> V NP PP
  – P(r | VP): the count of this rule divided by the number of VPs in a treebank
• Now we have rule r
  – VP(h(VP)) -> V(h(VP)) NP(h(NP)) PP(h(PP))
  – P(r | VP, h(VP), h(NP), h(PP))
• Sample sentence: "Workers dumped sacks into the bin"
  – VP(dumped) -> V(dumped) NP(sacks) PP(into)
  – P(r | VP, dumped is the verb, sacks is the head of the NP, into is the head of the PP)
  (see the counting sketch below)
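
To make the estimation concrete, here is a minimal Python sketch of the maximum-likelihood estimate for a fully lexicalized rule. The counts are hypothetical toy numbers, not from a real treebank:

    from collections import Counter

    # Hypothetical counts extracted from a toy treebank.
    # Key: (rule, head of VP, head of NP, head of PP).
    lexicalized_rule_counts = Counter({
        ("VP -> V NP PP", "dumped", "sacks", "into"): 1,
    })
    # How many VPs occur with that same head-word context.
    vp_context_counts = Counter({
        ("dumped", "sacks", "into"): 1,
    })

    def p_rule_lexicalized(rule, h_vp, h_np, h_pp):
        """MLE of P(rule | VP, h(VP), h(NP), h(PP)) = count(rule, heads) / count(heads)."""
        denom = vp_context_counts[(h_vp, h_np, h_pp)]
        return lexicalized_rule_counts[(rule, h_vp, h_np, h_pp)] / denom if denom else 0.0

    print(p_rule_lexicalized("VP -> V NP PP", "dumped", "sacks", "into"))  # 1.0 on this toy data

The tiny counts are exactly the problem the next slide points out: a specific (rule, heads) combination rarely occurs often enough in a treebank to estimate reliably.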

Problem with more specific rules
• Rule:
  – VP(dumped) -> V(dumped) NP(sacks) PP(into)
  – P(r | VP, dumped is the verb, sacks is the head of the NP, into is the head of the PP)
• Not likely to have significant counts in any treebank!

Usual trick: Assume Independence
• When stuck, exploit independence and collect the statistics you can… We'll capture two aspects:
  – Verb subcategorization
    • Particular verbs have affinities for particular VP expansions
  – Affinities between heads
    • Some phrase heads fit better with some predicate heads than others

Subcategorization
• Condition particular VP rules only on their head… so for
  r: VP(h(VP)) -> V(h(VP)) NP(h(NP)) PP(h(PP))
  P(r | VP, h(VP), h(NP), h(PP)) becomes P(r | VP, h(VP)) x ……
  e.g., P(r | VP, dumped)
• What's the count? The number of times this rule was used with dumped, divided by the total number of VPs that dumped appears in.

Phrase/head affinities for their predicates
• r: VP -> V NP PP ;  P(r | VP, h(VP), h(NP), h(PP))
  becomes  P(r | VP, h(VP)) x P(h(NP) | NP, h(VP)) x P(h(PP) | PP, h(VP))
  e.g.,  P(r | VP, dumped) x P(sacks | NP, dumped) x P(into | PP, dumped)
• To estimate P(into | PP, dumped): count the places where dumped is the head of a constituent that has a PP daughter with into as its head, and normalize (see the sketch below).
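
A minimal sketch of how the decomposed model scores an expansion. The rule probability and the PP-head affinity are the values quoted on the next slide; the NP-head affinity is an assumed placeholder, since the slides do not give it:

    # Illustrative probability tables; only the first two values come from the slides.
    p_rule_given_vhead = {("VP -> V NP PP", "dumped"): 0.67}
    p_pphead_given_vhead = {("into", "dumped"): 0.22}
    p_nphead_given_vhead = {("sacks", "dumped"): 0.5}   # assumed for illustration

    def score(rule, h_vp, h_np, h_pp):
        """P(rule | VP, h(VP)) * P(h(NP) | NP, h(VP)) * P(h(PP) | PP, h(VP))"""
        return (p_rule_given_vhead.get((rule, h_vp), 0.0)
                * p_nphead_given_vhead.get((h_np, h_vp), 0.0)
                * p_pphead_given_vhead.get((h_pp, h_vp), 0.0))

    print(score("VP -> V NP PP", "dumped", "sacks", "into"))   # about 0.0737

Each factor now has far more supporting counts in a treebank than the fully lexicalized rule did.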

Example (right)
• P(VP -> V NP PP | VP, dumped) = .67
• P(into | PP, dumped) = .22

Example (wrong)
• P(VP -> V NP | VP, dumped) = ..
• P(into | PP, sacks) = ..

PCFG Parsing: State of the art
(Figure from C. Manning, Stanford NLP)

Knowledge-Formalisms Map (including probabilistic formalisms)
• Morphology: State Machines (and prob. versions) (Finite State Automata, Finite State Transducers, Markov Models)
• Syntax: Rule systems (and prob. versions) (e.g., (Prob.) Context-Free Grammars)
• Semantics: Logical formalisms (First-Order Logics)
• Pragmatics, Discourse and Dialogue: AI planners (MDP: Markov Decision Processes)

Next three classes
• What meaning is and how to represent it
• Semantic Analysis: how to map sentences into their meaning
  – Complete mapping still impractical
  – "Shallow" version: Semantic Role Labeling
• Meaning of individual words (lexical semantics)
• Computational Lexical Semantics tasks
  – Word sense disambiguation
  – Word similarity

Today Oct 7
• Semantics / Meaning / Meaning Representations
• Linguistically relevant concepts in FOPC/FOL
• Semantic Analysis

Semantics
• Def. Semantics: the study of the meaning of words, intermediate constituents and sentences
• Def 1. Meaning: a representation that links the linguistic input to knowledge of the world
• Def 2. Meaning: a representation that expresses the linguistic input in terms of objects, actions, events, time, space… beliefs, attitudes... relationships
• Language independent

Semantic Relations involving Sentences
• Paraphrase: have the same meaning (same truth conditions)
  – I gave the apple to John vs. I gave John the apple
  – I bought a car from you vs. You sold a car to me
  – The thief was chased by the police vs. ……
• Entailment: "implication"
  – The park rangers killed the bear vs. The bear is dead
  – Nemo is a fish vs. Nemo is an animal
• Contradiction:
  – I am in Vancouver vs. I am in India

Meaning Structure of Language
• How does language convey meaning?
  – Grammaticization
  – Display a basic predicate-argument structure (e.g., verb complements)
  – Display a partially compositional semantics
  – Words

Grammaticization
• Concepts expressed by affixes:
  – Past: -ed
  – More than one: -s
  – Again: re-
  – Negation: in-, un-, de-
• Concepts expressed by words from non-lexical categories:
  – Obligation: must
  – Possibility: may
  – Definite, specific: the
  – Indefinite, non-specific: a
  – Disjunction: or
  – Negation: not
  – Conjunction: and

Predicate-Argument Structure
• Represents relationships among concepts
• Some words act like arguments and some words act like predicates:
  – Nouns as concepts or arguments: red(ball)
  – Adjectives, adverbs, verbs as predicates: red(ball)
• Sub-categorization frames for verbs specify the number, position, and syntactic category of their arguments
• Examples: give NP1 NP2, find NP, sneeze []  (see the sketch below)
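
Subcategorization frames are easy to treat as data. A minimal sketch, with an illustrative frame inventory (the "give NP to NP" frame is an assumption added for the example on the next slide):

    # Frames: each verb maps to the list of complement sequences it licenses.
    SUBCAT_FRAMES = {
        "give":   [["NP", "NP"], ["NP", "PP_to"]],   # give NP1 NP2 / give NP to NP (assumed)
        "find":   [["NP"]],
        "sneeze": [[]],                              # no complements
    }

    def licenses(verb, complements):
        """True if the verb has a frame matching the observed complement categories."""
        return complements in SUBCAT_FRAMES.get(verb, [])

    print(licenses("give", ["NP", "NP"]))   # True
    print(licenses("sneeze", ["NP"]))       # False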

Semantic (Thematic) Roles
• This can be extended to the realm of semantics
• Semantic roles: participants in an event
  – Agent: George, in both "George hit Bill" and "Bill was hit by George"
  – Theme: Bill, in both "George hit Bill" and "Bill was hit by George"
  – Also Source, Goal, Instrument, Force…
• Arguments in surface structure can be linked with their semantic roles:
  – Mary gave/sent/read a book to Ming   (Agent, Theme, Goal)
  – Mary gave/sent/read Ming a book      (Agent, Goal, Theme)
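
A small sketch of that linking step, mapping surface positions onto roles; the two linking patterns below are illustrative, not an exhaustive inventory:

    # The same roles surface in two different argument orders for ditransitives.
    LINKINGS = {
        "NP_V_NP_to_NP": {"subject": "Agent", "object": "Theme", "to-object": "Goal"},
        "NP_V_NP_NP":    {"subject": "Agent", "object1": "Goal", "object2": "Theme"},
    }

    def roles(pattern, fillers):
        """Map the surface positions of a clause onto thematic roles."""
        return {role: fillers[pos] for pos, role in LINKINGS[pattern].items()}

    print(roles("NP_V_NP_to_NP", {"subject": "Mary", "object": "a book", "to-object": "Ming"}))
    print(roles("NP_V_NP_NP", {"subject": "Mary", "object1": "Ming", "object2": "a book"}))
    # Both calls yield {'Agent': 'Mary', 'Theme': 'a book', 'Goal': 'Ming'} (up to key order).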

Requirements for Meaning Representations
(figure)

First Order Predicate Calculus (FOPC)
• FOPC provides a sound computational basis for verifiability, inference, expressiveness…
  – Supports determination of truth
  – Supports canonical form
  – Supports question-answering (via variables)
  – Supports inference
  – Argument-predicate structure
  – Supports compositionality of meaning

Common Meaning Representations
• Example sentence: "I have a car"
• Representations: FOPC, Semantic Nets, Frames (shown in the figure)
• Common foundation: structures composed of symbols that correspond to objects and relationships

Today Oct 7
• Semantics / Meaning / Meaning Representations
• Linguistically relevant concepts in FOPC/FOL
• Semantic Analysis

Categories & Events
• Categories:
  – VegetarianRestaurant(Joe's): relation vs. object
  – MostPopular(Joe's, VegetarianRestaurant)
  – ISA(Joe's, VegetarianRestaurant)   [reification]
  – AKO(VegetarianRestaurant, Restaurant)
• Events: can be described in NL with different numbers of arguments…
  – I ate
  – I ate a turkey sandwich at my desk
  – I ate lunch
  – I ate a turkey sandwich for lunch at my desk

Reification Again
• "I ate a turkey sandwich for lunch"
  ∃w: Isa(w, Eating) ∧ Eater(w, Speaker) ∧ Eaten(w, TurkeySandwich) ∧ MealEaten(w, Lunch)
• Reification advantage:
  – No need to specify a fixed number of arguments to represent a given sentence in NL
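
A minimal sketch of what reification buys you in code: the event is a single variable that predications attach to, so extra modifiers simply add conjuncts instead of changing any predicate's arity. The Location role below is an assumption added for illustration:

    def reify_eating(eater, eaten=None, meal=None, location=None):
        """Build the conjuncts describing one eating event."""
        w = "e1"   # the reified event variable
        facts = [("Isa", w, "Eating"), ("Eater", w, eater)]
        if eaten:
            facts.append(("Eaten", w, eaten))
        if meal:
            facts.append(("MealEaten", w, meal))
        if location:
            facts.append(("Location", w, location))   # assumed role name
        return facts

    print(reify_eating("Speaker", eaten="TurkeySandwich", meal="Lunch"))
    # "I ate a turkey sandwich for lunch at my desk" just adds location="MyDesk".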

MUC-4 Example
"On October 30, 1989, one civilian was killed in a reported FMLN attack in El Salvador."

  INCIDENT: DATE                  30 OCT 89
  INCIDENT: LOCATION              EL SALVADOR
  INCIDENT: TYPE                  ATTACK
  INCIDENT: STAGE OF EXECUTION    ACCOMPLISHED
  INCIDENT: INSTRUMENT ID
  INCIDENT: INSTRUMENT TYPE
  PERP: INCIDENT CATEGORY         TERRORIST ACT
  PERP: INDIVIDUAL ID             "TERRORIST"
  PERP: ORGANIZATION ID           "THE FMLN"
  PERP: ORG. CONFIDENCE           REPORTED: "THE FMLN"
  PHYS TGT: ID
  PHYS TGT: TYPE
  PHYS TGT: NUMBER
  PHYS TGT: FOREIGN NATION
  PHYS TGT: EFFECT OF INCIDENT
  PHYS TGT: TOTAL NUMBER
  HUM TGT: NAME
  HUM TGT: DESCRIPTION            "1 CIVILIAN"
  HUM TGT: TYPE                   CIVILIAN: "1 CIVILIAN"
  HUM TGT: NUMBER                 1: "1 CIVILIAN"
  HUM TGT: FOREIGN NATION
  HUM TGT: EFFECT OF INCIDENT     DEATH: "1 CIVILIAN"
  HUM TGT: TOTAL NUMBER

Representing Time
• Events are associated with points or intervals in time.
• We can impose an ordering on distinct events using the notion of precedes.
• Temporal logic notation: (∃ w, x, t) Arrive(w, x, t)
• Constraints on the variable t:
  "I arrived in New York"
  (∃ t) Arrive(I, NewYork, t) ∧ precedes(t, Now)

Interval Events
• Need t_start and t_end
  "She was driving to New York until now"
  ∃ t_start, t_end, e, i:
    ISA(e, Drive) ∧ Driver(e, She) ∧ Dest(e, NewYork) ∧
    IntervalOf(e, i) ∧ Endpoint(i, t_end) ∧ Startpoint(i, t_start) ∧
    Precedes(t_start, Now) ∧ Equals(t_end, Now)
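
A small sketch of the same interval event as data, with the two temporal constraints checked explicitly; representing time points as plain numbers is an assumption made purely for illustration:

    NOW = 100   # an arbitrary "current time" point

    event = {
        "type": "Drive", "Driver": "She", "Dest": "NewYork",
        "interval": {"start": 40, "end": NOW},
    }

    def holds_until_now(ev, now=NOW):
        """Precedes(t_start, Now) and Equals(t_end, Now), as in the formula above."""
        i = ev["interval"]
        return i["start"] < now and i["end"] == now

    print(holds_until_now(event))   # True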

Relation Between Tenses and Time
• The relation between simple verb tenses and points in time is not straightforward
• Present tense used like future:
  – We fly from Baltimore to Boston at 10
• Complex tenses:
  – Flight 1902 arrived late
  – Flight 1902 had arrived late
• Representing them in the same way seems wrong….

Reference Point
• Reichenbach (1947) introduced the notion of a reference point (R), separated out from the utterance time (U) and the event time (E)
• Example:
  – When Mary's flight departed, I ate lunch
  – When Mary's flight departed, I had eaten lunch
• The departure event specifies the reference point.

Today Oct 7
• Semantics / Meaning / Meaning Representations
• Linguistically relevant concepts in FOPC/FOL
• Semantic Analysis

Practical Goal for (Syntax-driven) Semantic Analysis
• Map NL queries into FOPC so that answers can be effectively computed
  – What African countries are not on the Mediterranean Sea?
  – Was 2007 the first El Niño year after 2001?

Semantic Analysis
(Flow diagram) Sentence -> Syntax-driven Semantic Analysis -> Literal Meaning -> Further Analysis -> Intended Meaning (via INFERENCE)
• Inputs to syntax-driven semantic analysis: meanings of grammatical structures, meanings of words
• Inputs to further analysis: common-sense and domain knowledge, discourse structure, context
• Example sentences in the figure: "I am going to SFU on Tue", "The garbage truck just left"; context utterances: "Shall we meet on Tue?", "What time is it?"

Compositional Analysis
• Principle of Compositionality
  – The meaning of a whole is derived from the meanings of its parts
• What parts?
  – The constituents of the syntactic parse of the input

Compositional Analysis: Example
• "AyCaramba serves meat" (parse tree and semantic attachments shown in the figure)

Augmented Rules
• Augment each syntactic CFG rule with a semantic formation rule
• Abstractly:  A -> α1 … αn   { f(α1.sem, …, αn.sem) }
• i.e., the semantics of A can be computed from some function applied to the semantics of its parts.
• The class of actions performed by f will be quite restricted.

Simple Extension of FOL: Lambda Forms
• A FOL sentence with variables in it that are to be bound.
• Lambda-reduction: variables are bound by treating the lambda form as a function with formal arguments (see the sketch below).
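
In Python terms, a lambda form is just a function waiting for its argument, and lambda-reduction is function application. A minimal sketch (the predicate Serves and the constants are taken from the restaurant example that follows):

    # λx. Serves(x, Meat): "reduction" is simply calling the function.
    lam = lambda x: ("Serves", x, "Meat")
    print(lam("AyCaramba"))              # ('Serves', 'AyCaramba', 'Meat')

    # Nested lambdas let a transitive verb collect its two arguments one at a time:
    serves = lambda obj: (lambda subj: ("Serves", subj, obj))
    print(serves("Meat")("AyCaramba"))   # ('Serves', 'AyCaramba', 'Meat')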

Augmented Rules: Example
• Concrete entities: assign FOL constants
  – PropNoun -> AyCaramba    Attachment: {AyCaramba}
  – MassNoun -> meat         Attachment: {MEAT}
• Simple non-terminals: copy the semantics up from daughters to mothers
  – NP -> PropNoun           Attachment: {PropNoun.sem}
  – NP -> MassNoun           Attachment: {MassNoun.sem}

Augmented Rules: Example
• Semantics attached to one daughter is applied to the semantics of the other daughter(s).
  – S -> NP VP        {VP.sem(NP.sem)}
  – VP -> Verb NP     {Verb.sem(NP.sem)}
  – Verb -> serves    {a lambda form, e.g., λx λy Serves(y, x)}

Example
(Figure: parse tree for "AyCaramba serves meat", with the attachments below combined bottom-up; see the sketch after this slide)
• S -> NP VP               {VP.sem(NP.sem)}
• VP -> Verb NP            {Verb.sem(NP.sem)}
• Verb -> serves           {lambda form}
• NP -> PropNoun           {PropNoun.sem}
• NP -> MassNoun           {MassNoun.sem}
• PropNoun -> AyCaramba    {AC}
• MassNoun -> meat         {MEAT}
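
A minimal sketch of that bottom-up combination in Python, assuming the usual lambda attachment for "serves" (λx λy Serves(y, x)); the constants AC and MEAT come from the lexical rules above:

    AC, MEAT = "AC", "MEAT"

    verb_serves_sem = lambda x: (lambda y: ("Serves", y, x))   # Verb -> serves
    np_obj_sem = MEAT     # NP -> MassNoun -> meat
    np_subj_sem = AC      # NP -> PropNoun -> AyCaramba

    vp_sem = verb_serves_sem(np_obj_sem)   # VP -> Verb NP : {Verb.sem(NP.sem)}
    s_sem = vp_sem(np_subj_sem)            # S -> NP VP    : {VP.sem(NP.sem)}

    print(s_sem)   # ('Serves', 'AC', 'MEAT')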

References (Project?)
• Textbook: Patrick Blackburn and Johan Bos (2005). Representation and Inference for Natural Language: A First Course in Computational Semantics. CSLI.
• J. Bos (2011). A Survey of Computational Semantics: Representation, Inference and Knowledge in Wide-Coverage Text Understanding. Language and Linguistics Compass 5(6): 336-366.

Next Time
• Read Chp. 19 (Lexical Semantics)

Non-Compositionality
• Unfortunately, there are lots of examples where the meaning of a constituent can't be derived from the meanings of the parts:
  – metaphor (e.g., corporation as person)
  – metonymy (??)
  – idioms
  – irony
  – sarcasm
  – indirect requests, etc.

English Idioms
• Lots of these… constructions where the meaning of the whole is either:
  – Totally unrelated to the meanings of the parts ("kick the bucket")
  – Related in some opaque way ("run the show")
• More examples:
  – "buy the farm"
  – "bite the bullet"
  – "bury the hatchet"
  – etc…

The Tip of the Iceberg
• "Enron is the tip of the iceberg."
  – NP -> "the tip of the iceberg"   {….}
• But also:
  – "the tip of an old iceberg"
  – "the tip of a 1000-page iceberg"
  – "the merest tip of the iceberg"
  – NP -> TipNP of IcebergNP   {…}
    TipNP: an NP with tip as its head
    IcebergNP: an NP with iceberg as its head

Handling Idioms
• Mixing lexical items and grammatical constituents
• Introduction of idiom-specific constituents
• Permit semantic attachments that introduce predicates unrelated to the constituents (see the sketch below)
  – NP -> TipNP of IcebergNP   {small-part(), beginning()….}
    TipNP: an NP with tip as its head
    IcebergNP: an NP with iceberg as its head
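
A minimal sketch of such an idiom-specific rule: the attachment introduces the predicates small-part and beginning, which no daughter contributes; the toy head finder is an assumption for illustration only:

    def head_noun(np_tokens):
        """Toy head finder: take the last token of the NP (illustration only)."""
        return np_tokens[-1]

    def idiom_attachment(tip_np, iceberg_np):
        """NP -> TipNP of IcebergNP  {small-part(), beginning()}."""
        if head_noun(tip_np) == "tip" and head_noun(iceberg_np) == "iceberg":
            return [("small-part", "x"), ("beginning", "x")]
        return None   # rule does not apply; fall back to ordinary compositional analysis

    print(idiom_attachment(["the", "merest", "tip"], ["a", "1000-page", "iceberg"]))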

Attachments for a fragment of English (Sect. 18.5, old edition)
• Sentences
• Noun phrases
• Verb phrases
• Prepositional phrases
• Based on "The Core Language Engine" (1992)

Full story more complex
• To deal properly with quantifiers:
  – Permit lambda-variables to range over predicates (e.g., see the sketch below)
  – Introduce complex terms to remain agnostic about the final scoping
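
A minimal sketch of a lambda variable that ranges over a predicate, in the style used for quantified NPs; the particular formula (λQ. ∀x Restaurant(x) => Q(x)) is a textbook-style illustration, not the slide's own example:

    # "every restaurant" as a function over a predicate Q.
    def every_restaurant(Q):
        return ("forall", "x", ("implies", ("Restaurant", "x"), Q("x")))

    closed = lambda x: ("Closed", x)      # the VP meaning, λx. Closed(x)
    print(every_restaurant(closed))
    # ('forall', 'x', ('implies', ('Restaurant', 'x'), ('Closed', 'x')))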

Solution: Quantifier Scope Ambiguity
• As with PP attachment, the number of possible interpretations is exponential in the number of complex terms
• Weak methods to prefer one interpretation over another:
  – Likelihood of different orderings
  – Mirror the surface ordering
  – Domain-specific knowledge

Integration with a Parser
• Assume you're using a dynamic-programming style parser (Earley or CKY).
• Two basic approaches:
  – Integrate semantic analysis into the parser (assign meaning representations as constituents are completed); see the sketch below
  – Pipeline… assign meaning representations only to complete trees, after parsing is done
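
A minimal sketch of the integrated option: whenever the parser completes a constituent, it immediately builds that constituent's semantics from its daughters' semantics. The attachment table and the hook name complete_constituent are illustrative, not part of any particular parser's API:

    # Semantic attachments keyed by (mother label, daughter labels).
    SEM_ATTACHMENTS = {
        ("S", ("NP", "VP")): lambda np, vp: vp(np),         # {VP.sem(NP.sem)}
        ("VP", ("Verb", "NP")): lambda verb, np: verb(np),  # {Verb.sem(NP.sem)}
    }

    def complete_constituent(label, daughter_labels, daughter_sems):
        """Would be called at, e.g., an Earley completion step."""
        build = SEM_ATTACHMENTS.get((label, tuple(daughter_labels)))
        if build is None:
            return None   # no attachment: leave semantics unset (or prune the edge)
        return build(*daughter_sems)

    serves_sem = lambda obj: (lambda subj: ("Serves", subj, obj))
    vp_sem = complete_constituent("VP", ["Verb", "NP"], [serves_sem, "MEAT"])
    print(complete_constituent("S", ["NP", "VP"], ["AC", vp_sem]))   # ('Serves', 'AC', 'MEAT')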

Pros and Cons
• Integration:
  – (+) can use semantic constraints to cut off parses that make no sense
  – (-) assigns meaning representations to constituents that don't take part in any correct parse
• Pipeline:
  – (+) assigns meaning representations only to constituents that take part in a correct parse
  – (-) the parser needs to generate all correct parses

Linguistically Relevant Concepts in FOPC
• Categories & Events (Reification)
• Representing Time
• Beliefs (optional, read if relevant to your project)
• Aspects (optional, read if relevant to your project)
• Description Logics (optional, read if relevant to your project)

Categories & Events
• Categories:
  – VegetarianRestaurant(Joe's): relation vs. object
  – MostPopular(Joe's, VegetarianRestaurant)
  – ISA(Joe's, VegetarianRestaurant)   [reification]
  – AKO(VegetarianRestaurant, Restaurant)
• Events: can be described in NL with different numbers of arguments…
  – I ate
  – I ate a turkey sandwich at my desk
  – I ate lunch
  – I ate a turkey sandwich for lunch at my desk

MUC-4 Example
"On October 30, 1989, one civilian was killed in a reported FMLN attack in El Salvador."
(Same filled template as in the MUC-4 slide above.)

Reification Again
• "I ate a turkey sandwich for lunch"
  ∃w: Isa(w, Eating) ∧ Eater(w, Speaker) ∧ Eaten(w, TurkeySandwich) ∧ MealEaten(w, Lunch)
• Reification advantages:
  – No need to specify a fixed number of arguments to represent a given sentence
  – You can easily specify inference rules involving the arguments

Representing Time
• Events are associated with points or intervals in time.
• We can impose an ordering on distinct events using the notion of precedes.
• Temporal logic notation: (∃ w, x, t) Arrive(w, x, t)
• Constraints on the variable t:
  "I arrived in New York"
  (∃ t) Arrive(I, NewYork, t) ∧ precedes(t, Now)

Interval Events
• Need t_start and t_end
  "She was driving to New York until now"
  ∃ t_start, t_end, e, i:
    ISA(e, Drive) ∧ Driver(e, She) ∧ Dest(e, NewYork) ∧
    IntervalOf(e, i) ∧ Endpoint(i, t_end) ∧ Startpoint(i, t_start) ∧
    Precedes(t_start, Now) ∧ Equals(t_end, Now)