Common Sense Reasoning

  • Slides: 61
Common Sense Reasoning

Common Sense Inference
• Let’s distinguish between:
  – Mathematical inference about common sense situations
    • Example: Formalize theory of behavior of liquids
  – Inference with common sense knowledge
    • Not too much about this yet

What is (mathematical) inference?
• Set of axioms (true assertions about the world)
• Inference engine (set of IF-THEN inference rules) that allows you to
  – Deduce new assertions from the old (forward chaining)
  – Determine whether a given assertion is true (backward chaining)
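The two chaining directions can be sketched with a toy rule set; the facts, rule names, and depth guard below are illustrative, not from any particular system:

```python
# Toy inference engine: each rule maps a set of premise facts to a conclusion.
RULES = [
    ({"bird(tweety)"}, "can_fly(tweety)"),
    ({"can_fly(tweety)"}, "can_travel(tweety)"),
]

def forward_chain(facts):
    """Deduce new assertions from the old until nothing changes."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

def backward_chain(goal, facts, depth=0):
    """Determine whether a given assertion is derivable, working backwards."""
    if goal in facts:
        return True
    if depth > 20:  # crude guard against circular rule chains
        return False
    return any(conclusion == goal and
               all(backward_chain(p, facts, depth + 1) for p in premises)
               for premises, conclusion in RULES)
```

Forward chaining saturates the fact base; backward chaining touches only the rules relevant to the query.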

Classic example
• Birds can fly.
• Tweety is a bird.
• Therefore… Tweety can fly.

Not-so-classic example
• Cheap apartments are rare.
• Rare things are expensive.
• Therefore… Cheap apartments are expensive.
• So, exactly what was wrong with that?

Common sense inference vs. Mathematical inference
• Mathematical inference = Exact definitions + Universally true statements + Complete reasoning + Depth-first exploration + Batch processing

Common sense inference vs. Mathematical inference
• Common sense inference = Imprecise definitions + Contingent statements + Incomplete reasoning + Breadth-first exploration + Incremental processing

Imprecise Definitions
• Mathematical inference assumes airtight definitions
• Common sense contains fluid definitions
  – Context-dependent
  – Fuzzy
  – Dynamic

Contingent statements
• All birds can fly, except
  – Penguins, ostriches, dead birds, injured birds, fictional birds, caged birds, …
• Circumscription
  – It’s true, unless you know otherwise
• Non-monotonic reasoning
  – It used to be true that all birds can fly, but now…
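The “true unless you know otherwise” behavior can be sketched as a default rule plus a retractable exception set; all names here are illustrative:

```python
# Non-monotonic default: birds fly unless listed as a known exception.
def can_fly(bird, exceptions):
    return bird not in exceptions

exceptions = set()
before = can_fly("tweety", exceptions)   # default conclusion: Tweety flies

exceptions.add("tweety")                 # we learn Tweety is a penguin
after = can_fly("tweety", exceptions)    # the old conclusion is withdrawn
```

Adding knowledge shrinks the set of conclusions, which is exactly what classical (monotonic) logic forbids.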

Incomplete reasoning
• Traditional logic looks for
  – Consistency (can’t prove a statement and its contradiction)
  – Completeness
• Common sense inference is neither consistent nor complete

Incremental processing
• Most logical formalisms assume a “batch” process
• You present assertions, queries, then the system cranks
• With common sense apps, you might learn stuff while the system is inferring
• The user might give you interactive feedback

Breadth-first exploration
• Most logical inference (e.g. resolution theorem proving) is depth-first
• Common sense is broad, not deep
• What we want is that, if a simple answer exists, we will find it quickly
• Best-first or most-relevant-first limits search
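A minimal best-first sketch: a priority queue keyed by a relevance score, so that shallow answers surface before deep ones. The graph and the constant scoring function are illustrative:

```python
import heapq

def best_first(start, goal, neighbors, relevance):
    """Expand the most relevant node first; stop at the first answer found."""
    frontier = [(-relevance(start), start)]   # max-relevance via negated key
    seen = {start}
    while frontier:
        _, node = heapq.heappop(frontier)
        if node == goal:
            return True
        for nxt in neighbors(node):
            if nxt not in seen:
                seen.add(nxt)
                heapq.heappush(frontier, (-relevance(nxt), nxt))
    return False
```

With a constant relevance score this degenerates to breadth-first search; a real relevance measure would bias the frontier toward the most promising assertions.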

If logic is broken, let’s fix it
• Non-monotonic logic and default logics
• Circumscription, Situation Calculus
  – Formalization of Context
• Fuzzy logic and probabilistic logics (e.g. Bayesian)
• Multiple-valued logic (yes, no, maybe, dunno)
• Modal logic (necessary, possible)

Example-based approaches
• Go from specific to general rather than general to specific
• Programming by Example
• Case-Based Reasoning
• Reasoning by Analogy
• Abduction

Causal Diversity

Maybe combine techniques?

Common Sense vs. Statistical techniques
• Some large-scale, numerical and statistical techniques have achieved success recently
• Will statistical techniques “run out”?
• Not necessarily opposed to knowledge-based approaches
• Could we use these techniques to “mine” Common Sense knowledge?

Common Sense and the Semantic Web
• There’s now a movement to make “The Semantic Web” -- turn the Web into the world’s largest knowledge base
• Could this be a vehicle for capturing or using Common Sense? We’ve got to untangle the Semantic Web formalisms
• Could this be a way to integrate disparate Common Sense architectures (to solve the software eng. problems of Minsky’s proposals)?

ThoughtTreasure, the hard common sense problem, and applications of common sense

Talk plan
• ThoughtTreasure overview
• ThoughtTreasure and the hard common sense problem
• Applications of common sense
  – SensiCal
  – NewsForms

ThoughtTreasure
• Commonsense knowledge base
• Architecture for natural language understanding
• Uses multiple representations: logic, finite automata, grids, scripts

ThoughtTreasure KB
• 35,023 English words/phrases
• 21,529 French words/phrases
• 51,305 commonsense assertions
• 27,093 concepts

ThoughtTreasure architecture
80,000 lines of code
• Text agency
• Syntactic component
• Semantic component
• Generator
• Planning agency
• Understanding agency

ThoughtTreasure applications
• Commonsense applications
• Simple factual question answering
  > What color are elephants?
  They are gray.
  > What is the circumference of the earth?
  40,003,236 meters.
  > Who created Bugs Bunny?
  Tex Avery created Bugs Bunny.
• Story understanding

The hard problem: Story understanding
Two robbers entered Gene Cook’s furniture store in Brooklyn and forced him to give them $1200.
Who was in the store initially? And during the robbery? Did Cook want to be robbed? Did the robbers tell Cook their names? Did Cook know he was going to be robbed? Does he now know he was robbed? What crimes were committed?
(adapted from http://www-formal.stanford.edu/jmc/mrhug.html)

ThoughtTreasure approach to story understanding
1. To build a computer that understands stories, build a computer that can construct simulations of the states and events described in the story.
[grid snapshot at 20020217T194352; Actor: Jim49; PAs: Sleep PA]

ThoughtTreasure approach to story understanding
2. Modularize the problem by having different agents work on different parts of the simulation.
[grid snapshot at 20020217T194352, annotated with understanding agents (Time UA, Space UA, Emotion UA, Sleep UA); Actor: Jim49; PAs: Sleep PA]

Story understanding by simulation
• Read a story
• Maintain a simulation = model of story
• Answer questions

The debate in psychology
• People reason using mental models
  Johnson-Laird, Philip N. (1993). Human and machine thinking.
• People reason using inference rules
  Rips, Lance J. (1994). The psychology of proof.

The debate in AI
• AI programs should use lucid representations
  Levesque, Hector (1986). Making believers out of computers.
• AI programs should use indirect representations
  Davis, Ernest (1991). (Against) Lucid representations.
• AI programs should use diverse representations
  Minsky, Marvin (1986). The society of mind.

Story understanding by simulation
• A simulation is a sequence of states
• A state is a snapshot of the mental world of each story character and the physical world
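As a sketch, such a simulation can be kept as a list of snapshots, where each input copies the previous state and edits it, leaving earlier snapshots intact for answering “when” questions. The field names below are illustrative:

```python
from copy import deepcopy

def next_state(prev):
    """Each input yields a fresh snapshot; earlier states stay intact."""
    return deepcopy(prev)

# Initial snapshot: physical world plus each character's mental world.
simulation = [{"physical": {"tv": "off"}, "mental": {"jim": "awake"}}]

state = next_state(simulation[-1])
state["physical"]["tv"] = "on"          # input: "Jim turned on the TV."
simulation.append(state)
```

Because old snapshots are never mutated, a question like “was the TV on two minutes ago?” is answered by indexing into the sequence.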

Each input updates the simulation
Jim turned on the TV.
[diagram: in state 5 the TV is Off; the input yields state 6, in which the TV is On]

Why this approach?
• Easy question answering by reading answers off the simulation
• Convenient modularization by components of the simulation

Modularization
One understanding agent per simulation component
• Space
• Time
• Actor – one per story character
• Device – one per device
• Goal
• Emotion
• Sleep
• Watch TV – one per story character
• …

Understanding agents
• Space UA – Jim walked to the table.
• Emotion UA – Jim was happy.
• Sleep UA – Jim was asleep.
• TV device agent – The TV was on.
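One way to sketch the dispatch: each agent claims the inputs in its domain and updates only its own slice of the state. The keyword triggers and the two-agent set are illustrative, not ThoughtTreasure’s actual mechanism:

```python
class SleepUA:
    """Understanding agent for the sleep component of the simulation."""
    def handles(self, sentence):
        return "asleep" in sentence or "awake" in sentence
    def update(self, state, sentence):
        state["sleep"] = "asleep" if "asleep" in sentence else "awake"

class EmotionUA:
    """Understanding agent for the emotion component."""
    def handles(self, sentence):
        return "happy" in sentence or "sad" in sentence
    def update(self, state, sentence):
        state["emotion"] = "happy" if "happy" in sentence else "sad"

def understand(sentence, state, agents):
    """Route each input to every agent that claims it."""
    for agent in agents:
        if agent.handles(sentence):
            agent.update(state, sentence)
    return state
```

Adding a new simulation component then means adding one agent, without touching the others.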

PhoneCall planning agent
[finite-state diagram of a telephone call: INITIAL → off-hook → wait for dialtone → dial → wait for ring or busy (busy: on-hook, retry after 30 sec) → ringing → hello → say-hello → talking → talk → say-bye → on-hook → done (FINAL)]
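Such a planning agent can be sketched as a transition table over (state, event) pairs; the state and event names below loosely follow the diagram and are illustrative:

```python
# (state, event) -> next state; unmatched events leave the state unchanged.
TRANSITIONS = {
    ("initial", "off-hook"): "wait-dialtone",
    ("wait-dialtone", "dialtone"): "dialing",
    ("dialing", "dialed"): "wait-answer",
    ("wait-answer", "busy"): "retry",        # hang up and retry after 30 sec
    ("retry", "off-hook"): "wait-dialtone",
    ("wait-answer", "hello"): "talking",
    ("talking", "say-bye"): "hanging-up",
    ("hanging-up", "on-hook"): "done",       # FINAL
}

def run_call(events, state="initial"):
    """Advance the call script through a sequence of observed events."""
    for event in events:
        state = TRANSITIONS.get((state, event), state)
    return state
```

The agent’s job is then to emit the next action needed to move toward the FINAL state, and to recognize story events as progress through the script.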

Inferences from grids
• Distance between objects
• Relative position of objects (left, right, front, back)
• Whether two actors can see or hear each other
• Whether there is a path from one location to another
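Two of these inferences sketched over a toy character grid (‘w’ marks a wall; the layout is illustrative, not one of ThoughtTreasure’s actual grids):

```python
from collections import deque

GRID = [
    "wwwww",
    "w...w",
    "w.w.w",
    "w...w",
    "wwwww",
]

def distance(a, b):
    """City-block distance between two (row, col) cells."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def path_exists(start, goal):
    """Breadth-first search over non-wall cells."""
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            return True
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if GRID[nr][nc] != "w" and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False
```

Relative position (left/right, front/back) falls out of comparing coordinates the same way distance does.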

ThoughtTreasure architecture
[dataflow diagram: input text (“He falls asleep. . .”) → Text agency (filters) → Syntactic component (base rules, syntactic parser → parse trees) → Semantic component (semantic parser, anaphoric parser → assertions) → Understanding agency (understanding agents, e.g. sleep understanding agent) ↔ Planning agency (planning agents, e.g. OBJ [sleep Jim49] STATE asleep) → generator → output text (“Yes, . . .”); discourse context: NOW 20020312T102344, SPEAKER Tom62, LISTENER TT, LANGUAGE English, CONTEXTS (SENSE 0.9, STORY TIME 20020217T115849, REPRESENTATIONS grid of Actor Jim49, EMOTIONS)]

Simple stories handled by ThoughtTreasure
• Two children, Bertie and Lin, in their playroom and bedrooms
• Jenny in her grocery store
• Jim waking up and taking a shower in his apartment

Story understanding by simulation
> Lin walks to her playroom.
Ts = 20020217T115849
[grid snapshot: Lin :-) awake, in the playroom]

Story understanding by simulation
> She is where?
She was in the playroom of the apartment.

Story understanding by simulation
> She walks to her bedroom.
Ts = 20020217T115902
[grid snapshot: Lin :-) awake, in her bedroom]

Story understanding by simulation
> She is where?
She was in her bedroom.
> She is near her bed?
Yes, she is in fact near the bed.

Story understanding by simulation
> She falls asleep.
Ts = 20020217T120034
[grid snapshot: Lin :-) asleep, in her bedroom]

Story understanding by simulation
> She is asleep?
Yes, she was in fact asleep.
> She is awake?
No, she is not awake.
> She was in the playroom when?
She was in the playroom of the apartment two minutes ago.

Applications of common sense
• SensiCal
• NewsForms

SensiCal
• Smart calendar application
• Speeds entry by filling in information
• Points out obvious blunders

SensiCal

Information extracted
• Type of item
• Participants
  – role
  – name
• Location
  – venue name and type
  – earth coordinates

Information extracted
Text: lunch w/lin at frank's steakhouse
StartTs: 20020528T120000
EndTs: 20020528T130000
ItemType: meal
MealType: lunch
Participant: Lin
Venue: Frank's Steakhouse
VenueType: steakhouse
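A rough sketch of how such fields might be pulled from a free-text entry; the pattern, vocabulary, and field names below are illustrative, not SensiCal’s actual parser:

```python
import re

MEAL_TYPES = {"breakfast", "lunch", "dinner"}
VENUE_TYPES = {"steakhouse", "cafe", "diner"}

def extract(text):
    """Parse entries shaped like 'ACTIVITY w/PERSON at VENUE'."""
    fields = {}
    m = re.match(r"(\w+) w/(\w+) at (.+)", text)
    if not m:
        return fields
    activity, person, venue = m.groups()
    if activity in MEAL_TYPES:
        fields["ItemType"] = "meal"
        fields["MealType"] = activity
    fields["Participant"] = person.capitalize()
    fields["Venue"] = " ".join(w.capitalize() for w in venue.split())
    for vt in VENUE_TYPES:
        if vt in venue:
            fields["VenueType"] = vt
    return fields
```

The venue type is what makes the common sense checks possible: knowing the venue is a steakhouse, not just a name, lets later rules fire.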

Common sense of calendaring
• People grocery shop in grocery stores
• Vegetables are found in grocery stores
• A person can’t be in two places at once
• Allow sufficient time to travel from one location to another
• People typically work during the day and sleep at night
• People do not usually attend business meetings on holidays
• Lunch is eaten around noon for an hour
• Don’t eat at restaurants that serve mostly food you avoid
• Vegetarians avoid meat
• A steak house serves beef
• Beef is meat
• You can’t visit a place that’s not open
• Restaurants do not generally serve dinner after 11 pm
• Museums are often closed on Mondays
• …

Representation of common sense in ThoughtTreasure
• Assertions
• Scripts
• Grids
• Procedures
  – Trip planning agent
  – Path planner

Assertions
[performed-in eat-dinner restaurant]
[serve-meal steakhouse beef]
[avoid vegetarian meat]
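Chaining such assertions yields SensiCal-style blunder checks (vegetarian lunch at a steakhouse): the [is-a beef meat] link and the tiny triple store below are illustrative additions, not ThoughtTreasure’s actual machinery:

```python
ASSERTIONS = {
    ("performed-in", "eat-dinner", "restaurant"),
    ("serve-meal", "steakhouse", "beef"),
    ("avoid", "vegetarian", "meat"),
    ("is-a", "beef", "meat"),        # illustrative hierarchy link
}

def blunder(diner_type, venue_type):
    """Does the venue serve something the diner avoids, directly or via is-a?"""
    for rel, venue, food in ASSERTIONS:
        if rel == "serve-meal" and venue == venue_type:
            categories = {food} | {c for r, f, c in ASSERTIONS
                                   if r == "is-a" and f == food}
            if any(("avoid", diner_type, c) in ASSERTIONS for c in categories):
                return True
    return False
```

One hop through the hierarchy (beef is meat) is all it takes to connect “steakhouse serves beef” with “vegetarians avoid meat”.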

Scripts
• Roles
  [role-of visit-museum goer]
  [role-of wedding-ceremony bride]
• Sequence of events
  [event01-of go-blading [put-on blader rollerblades]]
  [event02-of go-blading [blader]]
• Time and duration
  [min-value-of eat-lunch 11am]
  [max-value-of eat-lunch 2pm]
  [duration-of eat-lunch 1hrs]

Scripts
• Cost
  [cost-of take-subway $1]
• Entry conditions
  [entry-condition-of sleep [sleepy sleeper]]
• Goals
  [goal-of telephone-call [talk calling-party called-party]]

Grids
[ASCII grid of a grocery store interior, with legend:]
A: employee-side-of-counter
B: customer-side-of-counter
c: side-chair
f: store-refrigerator
m: Sprite
n: carrot
n: greenhouse-lettuce
n: tomato
n: onion
…

SensiCal status
• Prototype implemented in Tcl and Perl as extension to ical
• Communicates with ThoughtTreasure using the ThoughtTreasure server protocol

ThoughtTreasure vs. Cyc
• ThoughtTreasure inspired by Cyc
• Cyc mostly uses a single representation (logic); ThoughtTreasure uses multiple representations (logic, finite automata, grids)
• Recovery of script information from Cyc is difficult:
  (=> (and (isa ?X WoodRefinishing)
           (subEvents ?X ?U))
      (isa ?U Staining))
  (=> (and (isa ?U ShapingSomething)
           (subEvents ?U ?X))
      (isa ?X CuttingSomething))
• ThoughtTreasure scripts are easy to use:
  [event-of refinish-wood [stain human wood]]
  [event-of shape [cut human physical-object]]

ThoughtTreasure vs. OpenCyc
Analysis of nonlinguistic assertions
http://www.signiform.com/tt/htm/opencyctt.htm

              OpenCyc 0.6.0   ThoughtTreasure 0.00022
Hierarchical  62%             56%
Typing        33%             2%
Spatial       0%              4%
Script        0%              4%
Part                          2%
Property                      1%
Other         5%              31%
Assertions    60,878          51,305