Linear Classification
CSE 6363 – Machine Learning
Vassilis Athitsos
Computer Science and Engineering Department
University of Texas at Arlington
Example of Linear Classification
• Red points: patterns belonging to class C1.
• Blue points: patterns belonging to class C2.
• Goal: find a linear decision boundary separating C1 from C2.
• Points on one side of the line will be classified as belonging to C1; points on the other side will be classified as C2.
• The red line is one example of such a decision boundary.
  – It misclassifies a few patterns.
• The green line is another example.
Linear Classification
• Mathematically, assuming input patterns are D-dimensional vectors:
  – We are looking for a decision boundary in the form of a (D−1)-dimensional hyperplane separating the two classes.
  – Points on one side of the hyperplane will be classified as belonging to C1; points on the other side will be classified as C2.
• If inputs are 2-dimensional vectors, the decision boundary is a line.
• If inputs are 3-dimensional vectors, the decision boundary is a 2-dimensional surface (a plane).
Linear Classification
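The decision rule above is stated only in words; a minimal sketch of it in Python (the function and variable names are illustrative, not from the slides):

```python
import numpy as np

def classify(x, w, w0):
    """Assign x to class C1 if w.x + w0 >= 0, otherwise to C2.

    The set of points where w.x + w0 = 0 is the (D-1)-dimensional
    hyperplane that forms the decision boundary.
    """
    return "C1" if np.dot(w, x) + w0 >= 0 else "C2"

# 2-dimensional example: the boundary is the line x1 + x2 - 1 = 0.
w, w0 = np.array([1.0, 1.0]), -1.0
print(classify(np.array([2.0, 2.0]), w, w0))  # C1
print(classify(np.array([0.0, 0.0]), w, w0))  # C2
```

Points on opposite sides of the line get opposite labels; points exactly on the hyperplane are assigned to C1 here by convention.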
Logistic Regression
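The equations for these slides did not survive extraction; as a sketch, the standard logistic regression model passes the linear activation through the logistic sigmoid to get a class posterior (names are illustrative):

```python
import numpy as np

def sigmoid(a):
    # Logistic sigmoid: maps any real activation to the interval (0, 1).
    return 1.0 / (1.0 + np.exp(-a))

def posterior_c1(x, w, w0):
    # Logistic regression models P(C1 | x) = sigmoid(w.x + w0).
    # P(C2 | x) is then 1 - P(C1 | x).
    return sigmoid(np.dot(w, x) + w0)
```

When the activation w·x + w0 is zero, the model outputs probability 0.5 for each class, so the decision boundary is still the hyperplane w·x + w0 = 0.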
Finding the Most Likely Solution
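The slide content here was lost in extraction. In the standard maximum-likelihood formulation for logistic regression, the most likely weights are the ones minimizing the cross-entropy error; a sketch, assuming targets t ∈ {0, 1}:

```python
import numpy as np

def neg_log_likelihood(w, X, t):
    """Cross-entropy error for logistic regression.

    X: (N, D) input matrix (a column of ones can encode the bias),
    t: (N,) targets in {0, 1}.
    Minimizing this error is equivalent to maximizing the likelihood
    of the training labels under the model.
    """
    y = 1.0 / (1.0 + np.exp(-X @ w))  # P(C1 | x_n) for each row of X
    return -np.sum(t * np.log(y) + (1 - t) * np.log(1 - y))
```

Unlike least-squares linear regression, this error has no closed-form minimizer, which motivates the iterative methods on the following slides.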
Sequential Learning
Sequential Learning - Intuition
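The details of these slides were lost; as a sketch of the usual sequential (stochastic gradient descent) approach, each training example triggers one small weight update, using the fact that the per-sample gradient of the cross-entropy error is (y_n − t_n)·x_n (names and the learning rate are illustrative):

```python
import numpy as np

def sgd_step(w, x_n, t_n, eta=0.1):
    # One sequential update for logistic regression: move the weights
    # a small step against the per-sample error gradient.
    y_n = 1.0 / (1.0 + np.exp(-np.dot(w, x_n)))   # current prediction
    return w - eta * (y_n - t_n) * x_n
```

Intuitively, if the prediction y_n is too low for a positive example (t_n = 1), the update increases w·x_n, and vice versa; examples already predicted correctly with confidence produce only tiny updates.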
Iterative Reweighted Least Squares
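The equations for these slides did not extract; a sketch of the standard Newton-Raphson update for logistic regression, which gives the method its name (design-matrix notation is my own):

```python
import numpy as np

def irls_step(w, Phi, t):
    """One Newton-Raphson update for logistic regression.

    Phi: (N, M) design matrix, t: (N,) targets in {0, 1}.
    Each step solves a weighted least-squares problem with weights
    R = diag(y_n (1 - y_n)) that change at every iteration, hence
    "iterative reweighted least squares".
    """
    y = 1.0 / (1.0 + np.exp(-Phi @ w))
    R = np.diag(y * (1 - y))
    H = Phi.T @ R @ Phi              # Hessian of the cross-entropy error
    grad = Phi.T @ (y - t)           # gradient of the cross-entropy error
    return w - np.linalg.solve(H, grad)
```

Each iteration uses all the training data, unlike the sequential updates above, and typically converges in far fewer steps.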
Logistic Regression: Recap
Fisher's Linear Discriminant
Fisher's Linear Discriminant
• Goal in Fisher's Linear Discriminant: find the line projection that maximizes the separation of the classes.
• Key question: how do we measure separation of the classes?
• One simple (but not very useful) answer: maximize the separation of the means of the classes.
Maximizing Separation of Means
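The derivations on these slides were lost in extraction; as a sketch, maximizing the separation of the projected means alone simply picks the direction of the difference of the class means (names are illustrative):

```python
import numpy as np

def mean_separation_direction(X1, X2):
    # Project onto w proportional to the difference of the class means.
    # This maximizes the separation of the projected means, but it
    # ignores the within-class spread along the chosen direction,
    # which is why the slides call it simple but not very useful.
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    w = m2 - m1
    return w / np.linalg.norm(w)
```

Two classes with well-separated means can still overlap badly after projection if their variance along w is large, which motivates the Fisher criterion below.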
Between-Class Variance, Within-Class Variance
Fisher Criterion
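The formulas on these final slides did not survive extraction. In the standard form of Fisher's linear discriminant, the criterion J(w) is the ratio of between-class separation to within-class variance after projection, and the maximizing direction is w ∝ S_W⁻¹(m2 − m1); a sketch (names are my own):

```python
import numpy as np

def fisher_direction(X1, X2):
    """Fisher's linear discriminant direction: w proportional to
    S_W^{-1} (m2 - m1), where S_W is the total within-class scatter.

    This maximizes the Fisher criterion
        J(w) = (m2' - m1')^2 / (s1^2 + s2^2),
    the squared separation of the projected means divided by the
    summed within-class variances of the projections.
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S_w = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    w = np.linalg.solve(S_w, m2 - m1)
    return w / np.linalg.norm(w)
```

When both classes have equal, isotropic scatter, this reduces to the mean-difference direction from the previous section; otherwise it tilts the projection away from directions of large within-class spread.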