Learning From Examples AIMA Chapter 18

Outline
• Motivation for Learning
• Supervised Learning
• Expert Systems

Learning – What and Why?

ML is Everywhere!
• Natural Language Processing
• Computational Biology
• Targeted Advertising
• Face Recognition

Designing Learning Elements
Design of learning elements:
• What are we learning?
• How is the data represented?
• How do we get performance feedback?
Feedback:
• Supervised learning: each example is labeled
• Unsupervised learning: correct answers are not given
• Reinforcement learning: occasional rewards are given

Supervised Learning
• The data: labeled examples (x1, y1), …, (xN, yN), where each label yi = f(xi) for some unknown target function f
• Problem: find a hypothesis h that approximates f
• Performance: how accurately h predicts the labels of new, unseen examples
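The setup above can be made concrete with a tiny sketch (the data, the hypothesis, and the error measure are all hypothetical illustrations, not from the slides):

```python
# Toy illustration of the supervised-learning setup (hypothetical data):
# labeled examples (x, y), a candidate hypothesis h, and its empirical error.

data = [(1, 3), (2, 5), (3, 7), (4, 9)]  # examples where y = 2x + 1

def h(x):
    """A candidate hypothesis: here we guess h(x) = 2x + 1."""
    return 2 * x + 1

# Empirical error: the fraction of training examples the hypothesis gets wrong.
errors = sum(1 for x, y in data if h(x) != y)
error_rate = errors / len(data)
print(error_rate)  # 0.0 -- h matches the label on every example
```

A zero training error does not by itself guarantee good performance on unseen examples; that gap is exactly what the probabilistic view below addresses.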

Supervised Learning – a Probabilistic Take
• Samples: examples drawn independently from a fixed but unknown distribution D
• Problem: find a hypothesis h with low expected error over D
• Performance: the probability, over a fresh sample x ~ D, that h(x) ≠ f(x)

Probably Approximately Correct Learning
• PAC Learning: with probability at least 1 − δ (“probably”), the learned hypothesis has error at most ε (“approximately correct”)
• Both the confidence parameter δ and the accuracy parameter ε can be driven down by training on more samples
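For a finite hypothesis class, the standard PAC sample-complexity bound for a consistent learner is m ≥ (1/ε)(ln|H| + ln(1/δ)). A small sketch computing that bound (the numeric values are illustrative, not from the slides):

```python
import math

def pac_sample_bound(epsilon, delta, num_hypotheses):
    """Sample size sufficient for a consistent learner over a finite
    hypothesis class H to be probably (prob. >= 1 - delta) approximately
    (error <= epsilon) correct: m >= (1/eps) * (ln|H| + ln(1/delta))."""
    return math.ceil((math.log(num_hypotheses) + math.log(1 / delta)) / epsilon)

# e.g. |H| = 1000 hypotheses, epsilon = 0.1, delta = 0.05
print(pac_sample_bound(0.1, 0.05, 1000))  # 100
```

Note the bound grows only logarithmically in |H| and in 1/δ, but linearly in 1/ε.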

Learning a Classifier
Two competing objectives:
I. Simple explanations – “Occam’s Razor”
II. Low error rate

Choosing Simple Hypotheses
• Generalize better in the presence of noisy data
• A simple hypothesis space is faster to search through
• A simple hypothesis is easier and faster to use

Linear Classifiers
• A linear classifier labels a point by which side of a hyperplane it falls on: h(x) = sign(w · x + b)

Linear Classifiers
Question: given a dataset, how would we determine which linear classifier is “good”?
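One classic way to find *some* linear classifier for separable data is the perceptron. A minimal sketch, on a hypothetical toy dataset (the data and update rule here are standard illustrations, not taken from the slides):

```python
# A minimal perceptron: one way to find a linear classifier that
# separates a linearly separable toy dataset. Hypothetical data.

def perceptron(points, labels, epochs=100):
    """Learn weights w and bias b with the classic perceptron update."""
    dim = len(points[0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        updated = False
        for x, y in zip(points, labels):   # y in {-1, +1}
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:        # misclassified -> update
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
                updated = True
        if not updated:                    # converged: all points correct
            break
    return w, b

points = [(2.0, 1.0), (3.0, 2.0), (-1.0, -1.0), (-2.0, -2.0)]
labels = [1, 1, -1, -1]
w, b = perceptron(points, labels)
# Every training point is now classified correctly:
print(all(y * (sum(wi * xi for wi, xi in zip(w, x)) + b) > 0
          for x, y in zip(points, labels)))  # True
```

The perceptron stops at *any* separating line; it says nothing about which of the many separators is “good”, which is the question above.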

Least Squared Error
• Measure a candidate hypothesis by its sum of squared errors on the data, L(w) = Σi (yi − w · xi)², and pick the hypothesis minimizing it
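In one dimension the least-squares line has a closed form: slope = cov(x, y)/var(x), intercept from the means. A sketch on hypothetical data that lies exactly on a line, so the fit recovers it:

```python
# Least-squares fit of a line y = a*x + c in one dimension (closed form).
# Hypothetical data lying exactly on y = 2x + 1, so the fit recovers it.

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
# slope = covariance(x, y) / variance(x); intercept from the means
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x
print(slope, intercept)  # 2.0 1.0
```

With noisy data the same formulas return the line minimizing the sum of squared vertical distances rather than an exact interpolant.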

Support Vector Machines
Many candidates for a linear function minimizing LSE; which one should we pick?

Support Vector Machines
Question: why is the middle line “good”?
One possible approach: find a hyperplane that is “perturbation resistant”

Support Vector Machines
• “Maximize the margin, but do not misclassify!”
• Hard-margin formulation: minimize ‖w‖² / 2 subject to yi (w · xi + b) ≥ 1 for all i
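The quantity being maximized is the geometric margin: the distance from the hyperplane to the closest point, min_i yi (w · xi + b) / ‖w‖. A sketch computing it for a hypothetical hyperplane and toy dataset (both invented for illustration):

```python
# Geometric margin of a separating hyperplane w.x + b = 0 on a toy dataset:
# margin = min_i y_i (w . x_i + b) / ||w||. Hypothetical data and hyperplane.

import math

points = [(2.0, 2.0), (3.0, 3.0), (-2.0, -2.0), (-3.0, -1.0)]
labels = [1, 1, -1, -1]
w, b = (1.0, 1.0), 0.0   # candidate hyperplane: x1 + x2 = 0

norm = math.sqrt(sum(wi * wi for wi in w))
margin = min(y * (sum(wi * xi for wi, xi in zip(w, x)) + b) / norm
             for x, y in zip(points, labels))
print(round(margin, 4))  # distance of the closest point to the hyperplane
```

An SVM searches over (w, b) for the separating hyperplane that makes this number as large as possible.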


Regret Minimization

Regret
We want to do well – benchmark against something!
1. Do at least as well as the best algorithm?
2. Do at least as well as the best expert?

Best algorithm: pick expert 1 in rounds 1 & 2, and expert 2 in rounds 3 & 4
Best expert: expert 2 did the best in hindsight!
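The original slide's loss table did not survive extraction, but the comparison can be reproduced with a hypothetical table of per-round losses matching the description above:

```python
# Hypothetical per-round losses for two experts over four rounds,
# illustrating the two benchmarks: best switching algorithm vs. best
# fixed expert. (The concrete numbers are invented for illustration.)

losses = {
    "expert 1": [0, 0, 1, 1],   # best in rounds 1 & 2
    "expert 2": [1, 0, 0, 0],   # best in rounds 3 & 4, and best overall
}

# Best algorithm in hindsight: take the per-round minimum loss.
best_alg = sum(min(l[t] for l in losses.values()) for t in range(4))
# Best fixed expert in hindsight: the single expert with the lowest total.
best_expert = min(sum(l) for l in losses.values())
print(best_alg, best_expert)  # 0 1
```

The switching benchmark is strictly stronger; competing with it is generally hopeless, which is why regret is measured against the best *fixed* expert.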

Regret Minimization Definitions
• Regret = (total loss of our algorithm) − (total loss of the best fixed expert in hindsight)
• Goal: regret that grows sublinearly in the number of rounds, so average regret tends to 0

Round 1 – Greedy algorithm
• In each round, follow the expert whose total loss so far is smallest (breaking ties, say, by lowest index)
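A sketch of this greedy ("follow the leader") rule, assuming deterministic lowest-index tie-breaking, together with the standard adversarial sequence on which greedy errs every round (losses and tie-breaking rule are illustrative assumptions):

```python
# Greedy / follow-the-leader sketch: each round, follow the expert with
# the smallest total loss so far (ties broken by lowest index).

def greedy(losses):
    """losses[t][i] = loss of expert i in round t. Returns the algorithm's
    total loss over all rounds."""
    n = len(losses[0])
    totals = [0] * n
    alg_loss = 0
    for round_losses in losses:
        leader = min(range(n), key=lambda i: totals[i])  # best expert so far
        alg_loss += round_losses[leader]
        for i in range(n):
            totals[i] += round_losses[i]
    return alg_loss

# Adversarial sequence: the leader is wrong in every round.
losses = [[1, 0], [0, 1], [1, 0], [0, 1]]
print(greedy(losses), min(sum(c) for c in zip(*losses)))  # 4 2
```

On this sequence greedy pays 4 while the best fixed expert pays 2, which is why a deterministic rule cannot achieve vanishing regret against an adversary.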

Proof:

Round 2 – Randomized Greedy algorithm
• Break ties at random: pick uniformly among the experts currently tied for the lowest total loss
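A sketch of the randomized variant, under the assumption that ties among current leaders are broken uniformly at random (the loss sequence is the same illustrative adversarial one as before):

```python
# Randomized greedy sketch: follow a uniformly random expert among those
# currently tied for the lowest total loss. Randomizing foils the fixed
# adversarial sequence that defeats deterministic greedy.

import random

def randomized_greedy(losses, rng):
    """losses[t][i] = loss of expert i in round t; rng is a random.Random."""
    n = len(losses[0])
    totals = [0] * n
    alg_loss = 0
    for round_losses in losses:
        best = min(totals)
        leaders = [i for i in range(n) if totals[i] == best]
        choice = rng.choice(leaders)        # random tie-breaking
        alg_loss += round_losses[choice]
        for i in range(n):
            totals[i] += round_losses[i]
    return alg_loss

losses = [[1, 0], [0, 1], [1, 0], [0, 1]]
print(randomized_greedy(losses, random.Random(0)))
```

Because the adversary cannot predict the coin flips, the algorithm's *expected* loss on such sequences improves over the deterministic version, motivating the fully weighted randomization of multiplicative weights.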

Multiplicative Weights Updates
“The multiplicative weights algorithm was such a great idea, that it was discovered three times” – C. Papadimitriou [Seminar Talk]
Multiplicative Weights Update
• Maintain a weight for each expert, initialized to 1
• After each round, multiply each expert’s weight by (1 − η) for every unit of loss it incurred
• In each round, follow an expert chosen with probability proportional to its weight
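A minimal sketch of the update rule (the loss sequence and the choice η = 0.5 are illustrative assumptions; in practice η is tuned to the horizon):

```python
# Multiplicative weights update sketch: keep a weight per expert and
# shrink it by a factor (1 - eta) per unit of loss. The resulting
# weights concentrate on the expert that is best in hindsight.

def mw_weights(losses, eta=0.5):
    """losses[t][i] = loss of expert i (in [0, 1]) at round t.
    Returns the final normalized selection probabilities."""
    n = len(losses[0])
    weights = [1.0] * n
    for round_losses in losses:
        weights = [w * (1 - eta) ** l for w, l in zip(weights, round_losses)]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical run: expert 0 errs often, expert 1 rarely.
probs = mw_weights([[1, 0], [1, 0], [1, 0], [0, 1]])
print(probs[1] > probs[0])  # True: most probability mass on expert 1
```

With η tuned appropriately, the expected regret of this scheme grows only like the square root of the number of rounds times the log of the number of experts.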

Discussion Points