Knowledge Representation (Introduction)

Some Representations

Elements of a Representation
• Represented world: about what?
• Representing world: using what?
• Representing rules: how to map?
• Process that uses the representation: the conventions and systems that use the representations resulting from the above.
• A representation will always be missing some property of the real-world entity it is trying to model.

Marr’s levels of description
• Computational: What is the goal of the computation, why is it appropriate, and what is the logic of the strategy by which it can be carried out?
• Algorithmic: How can this computational theory be implemented? In particular, what is the representation for the input and output, and what is the algorithm for the transformation?
• Implementation: How can the representation and algorithm be realized physically?

Marr’s levels of description (cont.)
• Computational: a lot of cognitive psychology
• Algorithmic: a lot of cognitive science
• Implementation: neuroscience

Overview
• How knowledge representation works
  – Basics of logic (connectives, model theory, meaning)
• Basics of knowledge representation
  – Why use logic instead of natural language?
  – Quantifiers
  – Organizing large knowledge bases: ontology, microtheories
• Resources: OpenCyc, WordNet, UMLS

The CYC Project
• It now ‘knows’ a huge collection of fragments of real-world knowledge, such as:
  – Mothers are older than their children.
  – You have to be awake to eat.
  – You can usually see people’s noses, but not their hearts.
  – If you cut a lump of peanut butter in half, each half is a lump of peanut butter, but if you cut a table in half, neither half is a table.

The CYC Project
• The ultimate objective is to give it enough knowledge to understand ordinary books, so that it can read them and expand its own knowledge.
• So far, it has reached a stage where, when asked to find photos of “risky activities”, it located photos of people climbing mountains and doing white-water rafting.

How Knowledge Representation Works
• Intelligence requires knowledge
• Computational models of intelligence require models of knowledge
• Use formalisms to write down knowledge
  – Expressive enough to capture human knowledge
  – Precise enough to be understood by machines
• Separate knowledge from the computational mechanisms that process it
  – An important part of a cognitive model is what the organism knows

How knowledge representations are used in cognitive models
• The contents of the KB are part of the cognitive model
• Some models hypothesize multiple knowledge bases.
[Diagram: questions and requests go to Inference Mechanism(s), which draw on the Knowledge Base and return answers and analyses; examples and statements feed Learning Mechanism(s), which update the Knowledge Base.]

Expert Systems
• A KBS (knowledge-based system) on a domain that requires expertise (mainly rule-based, but not always)
• Rules capture “shallow” knowledge; the alternative is to reason from “first principles” or “deep” knowledge (e.g. Ohm’s law)
• Handling “uncertainty” in rules: MYCIN’s “certainty factors”
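
To make the certainty-factor idea concrete: MYCIN combined the certainty factors of independent rules supporting the same conclusion with a simple formula. The Python sketch below is a minimal, illustrative rendering of that combination rule (the function name and example values are invented here, and MYCIN’s full scheme, which also handles negative CFs, is richer):

    def combine_cf(cf_a, cf_b):
        # MYCIN-style combination of two positive certainty factors:
        # independent support reinforces, but never exceeds 1.
        # (Negative CFs are omitted in this sketch for brevity.)
        return cf_a + cf_b * (1 - cf_a)

    # Two rules each suggest the same diagnosis, with CF 0.6 and 0.4:
    print(combine_cf(0.6, 0.4))  # 0.76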

Knowledge Representation: revisited
• Production Rules:
  if leaves are yellowed AND soil is moist AND small white spots on undersides of leaves
  then plant is infested with spider mites; treat with DIANTHONINE
• Predicate Calculus: color(ball23, red).
• Semantic net: ball23 --color--> red
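
A production rule like the one above can be run by a tiny forward-chaining interpreter. The sketch below is a minimal, hypothetical Python version (the rule and facts come from the slide; the interpreter itself is an illustrative design, not a real expert-system shell):

    # Forward chaining: a rule fires when all of its conditions
    # are present in working memory.
    RULES = [
        ({"leaves are yellowed", "soil is moist",
          "small white spots on undersides of leaves"},
         {"plant is infested with spider mites", "treat with DIANTHONINE"}),
    ]

    def forward_chain(facts):
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for conditions, conclusions in RULES:
                if conditions <= facts and not conclusions <= facts:
                    facts |= conclusions
                    changed = True
        return facts

    print(forward_chain({"leaves are yellowed", "soil is moist",
                         "small white spots on undersides of leaves"}))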

Semantic Net: “Bill is a cat.”
[Diagram: a semantic net with nodes bill, cat, dog, feline, canine, mammal, animal, and living; links include inst (bill to cat), species (cat to feline, dog to canine), isa (up the hierarchy through mammal and animal to living), and has-part links to claw, paw, foot, and hair.]
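
One common way to encode such a net in code is as a set of (source, link, target) triples, with a lookup that follows inst/isa links so properties are inherited. The Python sketch below is illustrative, and the specific triples are one plausible reading of the diagram:

    TRIPLES = {
        ("bill", "inst", "cat"),
        ("cat", "isa", "mammal"),
        ("cat", "species", "feline"),
        ("mammal", "isa", "animal"),
        ("mammal", "has-part", "hair"),
        ("animal", "has-part", "foot"),
    }

    def links_from(node, link):
        return {t for (s, l, t) in TRIPLES if s == node and l == link}

    def has_part(node):
        # Collect has-part values, inheriting up inst/isa links.
        parts, frontier, seen = set(), {node}, set()
        while frontier:
            n = frontier.pop()
            if n in seen:
                continue
            seen.add(n)
            parts |= links_from(n, "has-part")
            frontier |= links_from(n, "inst") | links_from(n, "isa")
        return parts

    print(has_part("bill"))  # {'hair', 'foot'}, via cat -> mammal -> animal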

Knowledge Representation using structured objects: Frames

Frames
• Devised by Marvin Minsky, 1974.
• Incorporate certain valuable characteristics of human thinking:
  – Expectations, assumptions, stereotypes
  – Exceptions
  – Fuzzy boundaries between classes
• The essence of this form of knowledge representation is typicality, with exceptions, rather than definition.

Example: Frame System for a “typical” room

How frames are organized
• A frame system is a hierarchy of frames
• Each frame has:
  – a name
  – slots: the properties of the named entity, each with a value. A particular value may be:
    • a default value
    • a value inherited from a higher frame
    • a procedure, called a daemon, that finds the value
    • a specific value, which might represent an exception

Frames: some examples
• We will start with a simple piece of information: there is a category of things called cars.
• Given this information, we can start to build a frame:

Name: car
Subclass of: thing

• More information: a car has 4 wheels, is moved by an engine, and runs on petrol or diesel.
• We can now add three slots to the frame.
• The last of these has a restriction rather than a specific value.

“a car has 4 wheels, is moved by an engine, and runs on petrol or diesel.”

Name: car
Subclass of: thing
Slots:        Value:    Restrictions:
  wheels      4
  moved by    engine
  fuel        ?         petrol or diesel

car subclass_of thing with
  wheels: 4,
  moved_by: engine,
  fuel: [value: unknown, type: [petrol, diesel]].

• More information: there is a particular type of car called a VW, manufactured in Germany.
• We can add a second frame to our system, with one slot. We don’t need to repeat the slots and values of the previous frame: they will be inherited.

“there is a particular type of car called a VW, manufactured in Germany.”

Name: VW
Subclass of: car
Slots:        Value:
  made in     Germany

‘VW’ subclass_of car with made_in: ‘Germany’.

• More information: there is a particular type of VW called a Golf, which has a sun-roof.
• We can add a third frame to our system, with one slot. Once again, we don’t repeat the slots in the previous frames, because they will be inherited.

“there is a particular type of VW called a Golf, which has a sun-roof.”

Name: Golf
Subclass of: VW
Slots:        Value:
  top         sunroof

‘Golf’ subclass_of ‘VW’ with top: sunroof.

• More information: there is a particular type of Golf called a TDi, which runs on diesel. A TDi has 4 cylinders, and an engine capacity of 1.8 litres.
• We can add a fourth frame to our system, with three slots. One of the slots (fuel) was already in the system, but appears here because it now has a specific value rather than a restriction.

“there is a particular type of Golf called a TDi, which runs on diesel, has 4 cylinders, and has a 1.8 litre engine.”

Name: TDi
Subclass of: Golf
Slots:            Value:
  fuel            diesel
  engine capacity 1.8 litres
  cylinders       4

‘TDi’ subclass_of ‘Golf’ with fuel: diesel, engine_capacity: 1.8, cylinders: 4.
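
To make the inheritance behaviour concrete, here is a minimal Python sketch of the car hierarchy above. The dictionary encoding and the get_slot helper are illustrative choices, not a standard frame library:

    FRAMES = {
        "thing": {"subclass_of": None, "slots": {}},
        "car":   {"subclass_of": "thing",
                  "slots": {"wheels": 4, "moved_by": "engine"}},
        "VW":    {"subclass_of": "car", "slots": {"made_in": "Germany"}},
        "Golf":  {"subclass_of": "VW", "slots": {"top": "sunroof"}},
        "TDi":   {"subclass_of": "Golf",
                  "slots": {"fuel": "diesel", "engine_capacity": 1.8,
                            "cylinders": 4}},
    }

    def get_slot(name, slot):
        # Walk up the subclass_of chain until the slot is found (inheritance).
        frame = FRAMES.get(name)
        while frame is not None:
            if slot in frame["slots"]:
                return frame["slots"][slot]
            frame = FRAMES.get(frame["subclass_of"])
        return None

    print(get_slot("TDi", "fuel"))    # diesel (local value)
    print(get_slot("TDi", "wheels"))  # 4 (inherited from car)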

Scripts
• Knowledge representation researchers, particularly Roger Schank and his associates, devised some interesting variations on the theme of structured objects.
• In particular, they invented the idea of scripts (1973).
• A script is a description of a class of events in terms of contexts, participants, and sub-events.

Plans and Scripts (Roger Schank)
Example: Restaurant Script
“John went to a restaurant. He ordered lobster. He paid the bill and left.”
What did John eat?

Scripts
• Rather similar to frames: use inheritance and slots and describe stereotypical knowledge (i.e. if the system isn’t told some detail of what’s going on, it assumes the “default” information is true), but are concerned with events.
• Somewhat out of the mainstream of expert-systems work; more a development of natural-language-processing research.

Scripts
• Why represent knowledge in this way?
  – Because real-world events do follow stereotyped patterns. Human beings use previous experiences to understand verbal accounts; computers can use scripts instead.
  – Because people, when relating events, do leave large amounts of assumed detail out of their accounts. People don’t find it easy to converse with a system that can’t fill in missing conversational detail.

Scripts
• Scripts predict unobserved events.
• Scripts can build a coherent account from disjointed observations.
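
For instance, a restaurant script with an ordered list of default events can answer the “What did John eat?” question above by filling in the unobserved eating event. The Python sketch below is a hypothetical, minimal rendering of that idea (the event names and the fill_in helper are invented for illustration):

    # Default event sequence for the restaurant script.
    RESTAURANT_SCRIPT = ["enter", "sit down", "order", "eat", "pay", "leave"]

    def fill_in(observed):
        # Given observed events, assume the script's unobserved
        # default events also occurred.
        return [(e, e in observed) for e in RESTAURANT_SCRIPT]

    story = {"enter", "order", "pay", "leave"}  # from the John story
    for event, seen in fill_in(story):
        print(f"{event:10s} {'observed' if seen else 'assumed by script'}")
    # 'eat' is assumed, so John ate (the lobster he ordered).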

Scripts
• Commercial applications of script-like structured objects work on the basis that a conversation between two people on a pre-defined subject will follow a predictable course.
• Certain items of information need to be exchanged.
  – Others can be left unsaid (because both people know what the usual answer would be, or can deduce it from what’s been said already), unless, on this occasion, it’s an unusual answer.

What’s in a knowledge base?
• Facts about the specifics of the world
  – Fordham is a private university
  – The first thing Andrea did at the party was talk to John.
• Rules (aka axioms) that describe ways to infer new facts from existing facts
  – All triangles have three sides
  – All elephants are grey
• Facts and rules are stated in a formal language
  – Generally some form of logic (aka predicate calculus)
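
As a toy illustration of the fact/rule split, facts can be stored as tuples and a rule with a variable can derive new facts from them. The Python encoding below is invented for illustration (the fact format and rule function are not from any particular KR system):

    FACTS = {("isa", "Clyde", "Elephant")}

    def apply_grey_rule(facts):
        # Rule: for all x, isa(x, Elephant) => color(x, Grey)
        derived = {("color", x, "Grey")
                   for (rel, x, cls) in facts
                   if rel == "isa" and cls == "Elephant"}
        return facts | derived

    print(apply_grey_rule(FACTS))
    # {('isa', 'Clyde', 'Elephant'), ('color', 'Clyde', 'Grey')}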

Propositional logic
• A step towards understanding predicate calculus
• Statements are just atomic propositions, with no internal structure
  – Propositions can be true or false
• Statements can be made into larger statements via logical connectives
• Examples:
  – C = “It’s cold outside”; C is a proposition
  – O = “It’s October”; O is a proposition
  – If O then C; if it’s October, then it’s cold outside
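
A quick way to see a connective at work is to evaluate it over truth values. The short Python sketch below checks the “If O then C” rule from the slide (the implies helper is an illustrative definition of material implication, not a library function):

    def implies(p, q):
        # Material implication: "if p then q" is false only
        # when p is true and q is false.
        return (not p) or q

    O = True   # it's October
    C = False  # it's not cold outside
    print(implies(O, C))  # False: the rule "If O then C" is violated

    C = True
    print(implies(O, C))  # True: the rule holds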

Model Theory
• The meaning of a theory = the set of models that satisfy it
  – A model = a set of objects and relationships
  – If a statement is true in the KB, then the corresponding relationship(s) hold between the corresponding objects in the modeled world
  – The objects and relationships in a model can be formal constructs, pieces of the physical world, or whatever
• The meaning of a predicate = the set of things in the models for that theory which correspond to it
  – E.g., above means “above”, sort of

Representations as Sculptures
• How does one make a statue of an elephant?
  – Start with a marble block. Carve away everything that does not look like an elephant.
• How does one represent a concept?
  – Start with a vocabulary of predicates and other axioms. Add axioms involving the new predicate until it fits your intended model well.
• Knowledge representation is an evolutionary process
  – It isn’t quick, but incremental additions lead to incremental progress
  – All representations are by their nature imperfect

NL vs. Logic: Expressiveness
NL:
  Jim’s injury resulted from his falling.
  Jim’s falling caused his injury.
  Jim’s injury was a consequence of his falling.
  Jim’s falling occurred before his injury.
NL: write a rule for every expression?
Logic: identify the common concepts, e.g. the relation “x caused y”, and write rules about the common concepts, e.g.:
  x caused y => x temporally precedes y

NL vs. Logic: Ambiguity and Precision
NL: ambiguous
• x is at the bank.
  – river bank? financial institution?
• x is running.
  – changing location? operating? a candidate for office?
Logic: precise
• x is running-InMotion => x is changing location
• x is running-DeviceOperating => x is operating
• x is running-AsCandidate => x is a candidate
Reasoning (figuring out what must be true, given what is known) requires precision of meaning.

NL vs. Logic: Calculus of Meaning
Logic: well-understood operators enable reasoning.
Logical constants: not, and, or, all, some
  Not (All men are taller than all women).
  All men are taller than 12”.
  Therefore: Some women are taller than 12”.
Schematically:
  Not (All A are F than all B).
  All A are F than x.
  Therefore: Some B are F than x.

Syntax: Terms (aka Constants)
• Terms denote specific individuals or collections (relations, people, computer programs, types of cars...)
• Each term is a character string (in CycL, constants are prefixed by “#$”)
• A sampling of some constants:
  – Dog, SnowSkiing, PhysicalAttribute — these denote collections
  – BillClinton, Rover, DisneyLandTouristAttraction — these denote individuals (partially tangible individuals)
  – likesAsFriend, bordersOn, objectHasColor, and, not, implies, forAll — these denote individuals (relations)
  – RedColor, Soil-Sandy — these denote individuals (attribute values)

Syntax: Propositions
• A proposition is a relation applied to some arguments (also called formulas, sentences…)
• Examples:
  – (isa GeorgeWBush Person)
  – (likesAsFriend GeorgeWBush AlGore)

Why constraints are important
• They guide reasoning — e.g., argument constraints rule out ill-formed propositions such as:
  – (performedBy PaintingTheHouse Brick2)
  – (performedBy MarthaStewart CookingAPie)
• They constrain learning
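
A minimal sketch of how such argument constraints might be checked, in Python. The type table and the well_formed helper are invented for illustration (Cyc’s actual argument-constraint machinery is considerably richer):

    # Hypothetical constraint: performedBy takes (an Action, an Agent),
    # in that order.
    CONSTRAINTS = {"performedBy": ("Action", "Agent")}
    TYPES = {"PaintingTheHouse": "Action", "CookingAPie": "Action",
             "MarthaStewart": "Agent", "Brick2": "Object"}

    def well_formed(relation, *args):
        expected = CONSTRAINTS[relation]
        return (len(args) == len(expected) and
                all(TYPES.get(a) == t for a, t in zip(args, expected)))

    print(well_formed("performedBy", "PaintingTheHouse", "Brick2"))
    # False: a brick is not an Agent
    print(well_formed("performedBy", "MarthaStewart", "CookingAPie"))
    # False: the arguments are in the wrong order
    print(well_formed("performedBy", "PaintingTheHouse", "MarthaStewart"))
    # True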

Variables and Quantifiers
• General statements can be made by using variables and quantifiers
  – Variables in logic are like variables in algebra
• Sentences involving concepts like “everybody,” “something,” and “nothing” require variables and quantifiers:
  – Everybody loves somebody.
  – Nobody likes spinach.
  – Some people like spinach and some people like broccoli, but no one likes them both.
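
As one worked example, written in the Cyc-style syntax used above (the loves predicate is hypothetical), “Everybody loves somebody” becomes:

  (forAll ?X
    (implies (isa ?X Person)
      (thereExists ?Y
        (and (isa ?Y Person)
             (loves ?X ?Y)))))

Note the quantifier order: for each person ?X there is some ?Y. Reversing the quantifiers would instead claim there is a single person loved by everybody.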