Linear Classification
CSE 4309 – Machine Learning
Vassilis Athitsos
Computer Science and Engineering Department
University of Texas at Arlington
Example of Linear Classification
• Red points: patterns belonging to class C1.
• Blue points: patterns belonging to class C2.
• Goal: find a linear decision boundary separating C1 from C2.
• Points on one side of the line will be classified as belonging to C1; points on the other side will be classified as C2.
• The red line is one example of such a decision boundary.
  – It misclassifies a few patterns.
• The green line is another example.
Linear Classification
• Mathematically, assuming input patterns are D-dimensional vectors:
  – We are looking for a decision boundary in the form of a (D-1)-dimensional hyperplane separating the two classes.
  – Points on one side of the hyperplane will be classified as belonging to C1; points on the other side will be classified as C2.
• If inputs are 2-dimensional vectors, the decision boundary is a line.
• If inputs are 3-dimensional vectors, the decision boundary is a 2-dimensional surface (a plane).
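A decision rule of this form can be sketched in a few lines of Python; the weight vector `w` and bias `w0` below are illustrative values, not learned ones:

```python
import numpy as np

def linear_classify(x, w, w0):
    """Assign class C1 if w.x + w0 >= 0, and class C2 otherwise."""
    return "C1" if np.dot(w, x) + w0 >= 0 else "C2"

# Illustrative 2-D example: the decision boundary is the line x1 + x2 - 1 = 0.
w = np.array([1.0, 1.0])   # illustrative weights, not learned
w0 = -1.0                  # illustrative bias

print(linear_classify(np.array([2.0, 2.0]), w, w0))  # C1 (above the line)
print(linear_classify(np.array([0.0, 0.0]), w, w0))  # C2 (below the line)
```

For 3-dimensional inputs the same rule applies unchanged; the set of points with w.x + w0 = 0 is then a plane.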
Logistic Regression
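The logistic regression model computes the posterior probability P(C1 | x) as a sigmoid of a linear function of the input. A minimal sketch, with illustrative (not learned) weights:

```python
import numpy as np

def sigmoid(a):
    """Logistic sigmoid: maps any real activation to the interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-a))

def predict_proba(x, w, w0):
    """P(C1 | x) under a logistic regression model with weights w, bias w0."""
    return sigmoid(np.dot(w, x) + w0)

# Illustrative weights, not learned from data:
w = np.array([2.0, -1.0])
w0 = 0.5
p = predict_proba(np.array([1.0, 1.0]), w, w0)  # sigmoid(1.5), about 0.82
```

Classifying x as C1 when p >= 0.5 recovers exactly the linear decision boundary of the previous slides, since sigmoid(a) >= 0.5 if and only if a >= 0.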
Finding the Most Likely Solution
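The most likely weights are the ones minimizing the cross-entropy error, i.e. the negative log-likelihood of the training data under the model. A sketch of the error and its gradient, assuming targets t_n in {0, 1} and a data matrix X whose rows are the input vectors:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def cross_entropy(w, X, t):
    """E(w) = -sum_n [ t_n ln y_n + (1 - t_n) ln(1 - y_n) ], y_n = sigmoid(w.x_n)."""
    y = sigmoid(X @ w)
    return -np.sum(t * np.log(y) + (1 - t) * np.log(1 - y))

def gradient(w, X, t):
    """Gradient of E(w): sum over n of (y_n - t_n) * x_n."""
    y = sigmoid(X @ w)
    return X.T @ (y - t)
```

There is no closed-form minimizer; the gradient above is what the iterative methods on the following slides drive toward zero.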
Sequential Learning
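In sequential (online) learning, the weights are updated one training pattern at a time, stepping down the gradient of that single pattern's error. A sketch, where eta is a hypothetical learning-rate parameter:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def sequential_update(w, x, t, eta):
    """One stochastic-gradient step on a single training pattern (x, t):
    w_new = w - eta * (y - t) * x, where y = sigmoid(w.x)."""
    y = sigmoid(np.dot(w, x))
    return w - eta * (y - t) * x
```

Sweeping repeatedly over the training set and applying this update to each pattern in turn (with a suitably small eta) gradually decreases the total cross-entropy error.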
Sequential Learning - Intuition
Iterative Reweighted Least Squares
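Each IRLS iteration is a Newton-Raphson step on the cross-entropy error; the update has the closed form w_new = w - (Phi^T R Phi)^{-1} Phi^T (y - t), where R is the diagonal matrix with entries y_n (1 - y_n). A sketch of one such step:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def irls_step(w, Phi, t):
    """One Newton-Raphson (IRLS) update for logistic regression:
    w_new = w - (Phi^T R Phi)^{-1} Phi^T (y - t),  R = diag(y_n (1 - y_n))."""
    y = sigmoid(Phi @ w)
    R = np.diag(y * (1 - y))
    H = Phi.T @ R @ Phi          # Hessian of the cross-entropy error
    g = Phi.T @ (y - t)          # gradient of the cross-entropy error
    return w - np.linalg.solve(H, g)
```

Because the cross-entropy error is convex in w, repeating this step typically converges in a handful of iterations, far fewer than plain gradient descent.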
Logistic Regression: Recap
Fisher's Linear Discriminant
• Goal in Fisher's Linear Discriminant: find the line projection that maximizes the separation of the classes.
• Key question: how do we measure separation of the classes?
• One simple (but not very useful) answer: maximize the separation of the means of the classes.
Maximizing Separation of Means
Between-Class Variance, Within-Class Variance
Fisher Criterion
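Maximizing the Fisher criterion (between-class variance over within-class variance of the projected data) yields the projection direction w proportional to S_W^{-1} (m2 - m1), where S_W is the within-class scatter matrix. A sketch computing this direction from two sets of sample points:

```python
import numpy as np

def fisher_direction(X1, X2):
    """Fisher's linear discriminant direction w = S_W^{-1} (m2 - m1),
    where S_W is the sum of the two classes' scatter matrices.
    X1, X2: arrays whose rows are the training patterns of each class."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    return np.linalg.solve(S_W, m2 - m1)
```

Unlike projecting onto m2 - m1 directly, dividing by S_W penalizes directions along which each class is spread out, so the projected classes overlap less.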