Linear Classification
CSE 6363 – Machine Learning
Vassilis Athitsos
Computer Science and Engineering Department, University of Texas at Arlington

Example of Linear Classification (slide 2)
• Red points: patterns belonging to class C1.
• Blue points: patterns belonging to class C2.
• Goal: find a linear decision boundary separating C1 from C2.
• Points on one side of the line will be classified as belonging to C1; points on the other side will be classified as C2.
• The red line is one example of such a decision boundary. It misclassifies a few patterns.
• The green line is another example.
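A minimal sketch of this rule in code, assuming a hypothetical 2D boundary with weight vector w and bias b (the values below are illustrative, not taken from the slides):

import numpy as np

# Hypothetical boundary parameters, for illustration only.
w = np.array([1.0, -2.0])   # normal vector of the decision line
b = 0.5                     # bias (offset of the line from the origin)

def classify(x):
    # A point is assigned to C1 if it falls on the positive side of the
    # line w . x + b = 0, and to C2 if it falls on the negative side.
    return "C1" if np.dot(w, x) + b >= 0 else "C2"

print(classify(np.array([3.0, 1.0])))    # positive side -> C1
print(classify(np.array([-1.0, 2.0])))   # negative side -> C2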

Linear Classification (slide 3)
• Mathematically, assuming input patterns are D-dimensional vectors:
– We are looking for a decision boundary in the form of a (D−1)-dimensional hyperplane separating the two classes.
– Points on one side of the hyperplane will be classified as belonging to C1; points on the other side will be classified as C2.
• If inputs are 2-dimensional vectors, the decision boundary is a line.
• If inputs are 3-dimensional vectors, the decision boundary is a 2-dimensional surface (a plane).
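For concreteness, the standard way to write such a boundary (the slides' own notation was not recovered, so this is the usual textbook form):

y(\mathbf{x}) = \mathbf{w}^\top \mathbf{x} + w_0

Assign x to C1 if y(x) ≥ 0 and to C2 otherwise. The decision boundary is the set where y(x) = 0, a (D−1)-dimensional hyperplane whose orientation is determined by w and whose offset from the origin is controlled by the bias w_0.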

Linear Classification (slides 4–6)

Logistic Regression (slides 7–9)
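The body text of these slides was not recovered. For reference, the standard logistic regression model, which this section presumably introduces, expresses the posterior probability of class C1 through the logistic sigmoid function:

p(C_1 \mid \boldsymbol{\phi}) = y(\boldsymbol{\phi}) = \sigma(\mathbf{w}^\top \boldsymbol{\phi}), \qquad \sigma(a) = \frac{1}{1 + e^{-a}}

where φ = φ(x) is a (possibly transformed) feature vector and p(C2 | φ) = 1 − p(C1 | φ). Because σ is monotonic, the decision boundary p(C1 | φ) = 0.5 is still linear in φ.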

Finding the Most Likely Solution (slides 10–11)
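These slide bodies were also lost; presumably they derive the maximum-likelihood estimate of w. For targets t_n ∈ {0, 1} and outputs y_n = σ(wᵀφ_n), maximizing the likelihood is equivalent to minimizing the cross-entropy error (a standard result, stated here for reference):

E(\mathbf{w}) = -\ln p(\mathbf{t} \mid \mathbf{w}) = -\sum_{n=1}^{N} \left\{ t_n \ln y_n + (1 - t_n) \ln(1 - y_n) \right\}

whose gradient takes the simple form ∇E(w) = Σ_n (y_n − t_n) φ_n. There is no closed-form minimizer, which motivates the iterative methods in the sections that follow.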

Sequential Learning (slides 20–21)
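A minimal sketch of sequential learning for logistic regression, assuming it refers (as is standard) to stochastic gradient descent on the cross-entropy error, one training pattern at a time; the function and parameter names are mine, not the slides':

import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def sgd_logistic(Phi, t, eta=0.1, epochs=100):
    # Visit one pattern at a time and apply the update
    #     w <- w - eta * (y_n - t_n) * phi_n,
    # where (y_n - t_n) * phi_n is the gradient of the cross-entropy
    # error contributed by pattern n alone.
    N, D = Phi.shape
    w = np.zeros(D)
    for _ in range(epochs):
        for n in range(N):
            y_n = sigmoid(w @ Phi[n])
            w = w - eta * (y_n - t[n]) * Phi[n]
    return w

Because each update uses a single pattern, the method can process training data as it arrives, without storing the whole training set.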

Sequential Learning - Intuition (slides 22–23)

Iterative Reweighted Least Squares (slides 24–28)
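A sketch of the standard IRLS (Newton-Raphson) update for logistic regression, using the known results that the gradient of the cross-entropy error is Φᵀ(y − t) and its Hessian is ΦᵀRΦ with R = diag(y_n(1 − y_n)); names are illustrative:

import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def irls(Phi, t, iterations=10):
    # Newton-Raphson update:
    #     w <- w - (Phi^T R Phi)^{-1} Phi^T (y - t),
    # where R = diag(y_n * (1 - y_n)) reweights each pattern at every
    # iteration, hence "iterative reweighted least squares".
    N, D = Phi.shape
    w = np.zeros(D)
    for _ in range(iterations):
        y = sigmoid(Phi @ w)
        R = np.diag(y * (1.0 - y))
        grad = Phi.T @ (y - t)
        hess = Phi.T @ R @ Phi
        w = w - np.linalg.solve(hess, grad)
    return w

Note that for linearly separable data the maximum-likelihood weights diverge, so in practice the iteration count is capped or a regularization term is added.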

Logistic Regression: Recap (slide 29)

Fisher's Linear Discriminant (slides 30–32)

Fisher's Linear Discriminant (slide 33)
• Goal in Fisher's Linear Discriminant: find the line projection that maximizes the separation of the classes.
• Key question: how do we measure the separation of the classes?
• One simple (but not very useful) answer: maximize the separation of the means of the classes.

Maximizing Separation of Means (slides 34–38)
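The slide bodies here were not recovered; for reference, the standard setup is to project each pattern onto a line, y = wᵀx, and compare the projected class means:

\mathbf{m}_k = \frac{1}{N_k} \sum_{n \in \mathcal{C}_k} \mathbf{x}_n, \qquad m_2 - m_1 = \mathbf{w}^\top (\mathbf{m}_2 - \mathbf{m}_1)

Maximizing m_2 − m_1 subject to ‖w‖ = 1 gives w ∝ (m_2 − m_1). The weakness of this criterion is that it ignores the within-class variance along w: two classes with well-separated means can still overlap heavily after projection, which is what Fisher's criterion addresses.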

Between-Class Variance, Within-Class Variance (slide 39)
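For reference, the standard two-class definitions of the between-class scatter S_B and within-class scatter S_W, which the Fisher criterion below combines:

\mathbf{S}_B = (\mathbf{m}_2 - \mathbf{m}_1)(\mathbf{m}_2 - \mathbf{m}_1)^\top, \qquad \mathbf{S}_W = \sum_{k=1}^{2} \sum_{n \in \mathcal{C}_k} (\mathbf{x}_n - \mathbf{m}_k)(\mathbf{x}_n - \mathbf{m}_k)^\top

After projection, wᵀS_B w is the between-class variance and wᵀS_W w the total within-class variance of the projected data.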

Fisher Criterion (slide 40)
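The Fisher criterion is the ratio of between-class to within-class variance of the projected data,

J(\mathbf{w}) = \frac{\mathbf{w}^\top \mathbf{S}_B \mathbf{w}}{\mathbf{w}^\top \mathbf{S}_W \mathbf{w}}

and its maximizer is w ∝ S_W^{-1}(m_2 − m_1). A minimal sketch, assuming the two-class scatter definitions above (the function name is mine):

import numpy as np

def fisher_direction(X1, X2):
    # Direction maximizing the Fisher criterion: w ∝ S_W^{-1} (m2 - m1),
    # where X1 and X2 hold the patterns of each class as rows.
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    w = np.linalg.solve(S_W, m2 - m1)
    return w / np.linalg.norm(w)

Classification then thresholds the projected value wᵀx, for example at the midpoint of the two projected class means.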