Neural Networks Part 1 - Introduction
CSE 4309 – Machine Learning
Vassilis Athitsos
Computer Science and Engineering Department, University of Texas at Arlington
Perceptrons
• A perceptron takes real-valued inputs x1, …, xd, plus a fixed bias input x0 = 1.
• Each input xi is multiplied by a corresponding weight wi, and the perceptron computes the weighted sum z = w0 + w1x1 + … + wdxd.
• The output of the perceptron is h(z), where h is called the activation function.
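The perceptron computation described above can be sketched in a few lines of Python. This is an illustrative sketch, not code from the slides; the function name and the choice of a step activation are my own.

```python
def perceptron(weights, inputs):
    """Compute a perceptron's output: a weighted sum of the inputs
    (with a fixed bias input x0 = 1) passed through a step activation."""
    x = [1.0] + list(inputs)                        # prepend the bias input x0 = 1
    z = sum(w * xi for w, xi in zip(weights, x))    # weighted sum w0 + w1*x1 + ...
    return 1 if z >= 0 else 0                       # step activation
```

For example, `perceptron([-1.5, 1, 1], [1, 1])` returns 1, since the weighted sum is −1.5 + 1 + 1 = 0.5 ≥ 0.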
Perceptrons and Neurons
• Perceptrons are inspired by neurons.
– Neurons are the cells forming the nervous system and the brain.
– Neurons somehow sum up their inputs, and if the sum exceeds a threshold, they "fire".
• Since brains are "intelligent", computer scientists have been hoping that perceptron-based systems can be used to model intelligence.
Activation Functions
• The activation function h maps the perceptron's weighted sum z to its output.
• One common choice is the step function: h(z) = 1 if z ≥ 0, and h(z) = 0 otherwise.
• Another common choice is the sigmoid function: h(z) = 1 / (1 + e^(−z)), which produces outputs strictly between 0 and 1 and is differentiable everywhere.
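The two standard activation functions, the step function and the sigmoid, can be written directly from their definitions; a minimal sketch:

```python
import math

def step(z):
    """Step activation: 1 if the weighted sum is at least 0, else 0."""
    return 1 if z >= 0 else 0

def sigmoid(z):
    """Sigmoid activation: a smooth function mapping z to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))
```

The step function produces hard 0/1 decisions, while the sigmoid produces graded values (for example, sigmoid(0) = 0.5); the sigmoid's differentiability is what later makes gradient-based training possible.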
Example: The AND Perceptron
• Suppose we use the step function for activation.
• Suppose boolean value false is represented as number 0, and boolean value true as number 1.
• Then, the perceptron below computes the boolean AND function:
false AND false = false
false AND true = false
true AND false = false
true AND true = true
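One choice of weights that makes a step-activation perceptron compute AND is a bias weight of −1.5 with both input weights equal to 1 (these are illustrative values; the slide's figure may use different weights). A minimal sketch:

```python
def step(z):
    return 1 if z >= 0 else 0

def and_perceptron(x1, x2):
    """AND as a perceptron: fires only when both inputs are 1,
    since the weighted sum -1.5 + x1 + x2 reaches 0 only for x1 = x2 = 1."""
    return step(-1.5 + 1 * x1 + 1 * x2)
```

Checking all four input combinations reproduces the AND truth table shown above.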
Example: The OR Perceptron
• Suppose we use the step function for activation.
• Suppose boolean value false is represented as number 0, and boolean value true as number 1.
• Then, the perceptron below computes the boolean OR function:
false OR false = false
false OR true = true
true OR false = true
true OR true = true
Example: The NOT Perceptron
• Suppose we use the step function for activation.
• Suppose boolean value false is represented as number 0, and boolean value true as number 1.
• Then, the perceptron below computes the boolean NOT function:
NOT(false) = true
NOT(true) = false
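OR and NOT can be built the same way as AND, just with different weights. One common choice (illustrative, not necessarily the weights in the slides' figures) is a bias of −0.5 for OR, and a bias of +0.5 with input weight −1 for NOT:

```python
def step(z):
    return 1 if z >= 0 else 0

def or_perceptron(x1, x2):
    """OR: fires when at least one input is 1 (-0.5 + x1 + x2 >= 0)."""
    return step(-0.5 + 1 * x1 + 1 * x2)

def not_perceptron(x):
    """NOT: fires only when the input is 0 (0.5 - x >= 0)."""
    return step(0.5 - 1 * x)
```

Both functions reproduce the truth tables on the preceding slides.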
The XOR Function
• As before, we represent false with 0 and true with 1:
false XOR false = false
false XOR true = true
true XOR false = true
true XOR true = false
• The figure shows the four input points of the XOR function:
– red corresponds to output value true.
– green corresponds to output value false.
• The two classes (true and false) are not linearly separable.
• Therefore, no perceptron can compute the XOR function.
Our First Neural Network: XOR
• A neural network is built using perceptrons as building blocks.
• The inputs to some perceptrons are outputs of other perceptrons.
• Here is an example neural network computing the XOR function (figure: units 3 and 4 form a hidden layer, and unit 5 produces the output).
Our First Neural Network: XOR
• The XOR network shows how individual perceptrons can be combined to perform more complicated functions (figure: an OR unit and an AND unit feed an output unit computing A AND (NOT B)).
Computing the Output: An Example
• The slide steps through the XOR network for one specific input: first the OR unit and the AND unit compute their outputs from the network inputs, and then the output unit computes A AND (NOT B) from those two values.
Verifying the XOR Network
• To verify that the network computes XOR, we check all four input combinations.
• The OR unit computes A OR B, the AND unit computes A AND B, and the output unit computes (A OR B) AND (NOT (A AND B)).
• That expression is true exactly when one input is true and the other is false, which is the definition of A XOR B.
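The verification above can be carried out mechanically. The sketch below uses illustrative step-perceptron weights of my own choosing (OR with bias −0.5, AND with bias −1.5, and an output unit computing A AND (NOT B) with weights −0.5, +1, −1), following the structure described on the slides:

```python
def step(z):
    return 1 if z >= 0 else 0

def xor_network(x1, x2):
    """Two-layer network for XOR: hidden OR and AND units,
    then an output unit computing (A OR B) AND (NOT (A AND B))."""
    u3 = step(-0.5 + x1 + x2)    # unit 3: OR
    u4 = step(-1.5 + x1 + x2)    # unit 4: AND
    return step(-0.5 + u3 - u4)  # unit 5: u3 AND (NOT u4)
```

Running all four input combinations reproduces the XOR truth table, even though no single perceptron can do so.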
Neural Networks
• This neural network example consists of six units:
– Three input units (including the not-shown bias input).
– Three perceptrons.
• Yes, in the notation we will be using, inputs count as units.
Neural Network Layers
• Oftentimes, neural networks are organized into layers.
• The input layer is the initial layer of input units (units 0, 1, 2 in our example).
– Unit 0 is the bias input, not shown.
• The output layer is at the end (unit 5 in our example).
• Zero, one, or more hidden layers can be between the input and output layers.
Neural Network Layers
• There is only one hidden layer in our example, containing units 3 and 4.
• Each hidden layer's inputs (except bias inputs) are outputs from the previous layer.
• Each hidden layer's outputs are inputs to the next layer.
• The first hidden layer's inputs come from the input layer.
• The last hidden layer's outputs are inputs to the output layer.
Feedforward Networks
• Feedforward networks are networks where there are no directed loops.
• If there are no loops, the output of a neuron cannot (directly or indirectly) influence its input.
• While there are varieties of neural networks that are not feedforward or layered, our main focus will be layered feedforward networks.
Computing the Output
• To compute the network's output for a given input, we process the layers in order: first the input layer, then each hidden layer, and finally the output layer.
• Each perceptron computes its weighted sum over the previous layer's outputs (plus the bias input) and applies its activation function; its output then serves as an input to units in the next layer.
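The layer-by-layer computation can be sketched generically. In this illustrative sketch (the representation of a network as nested weight lists is my own, not the slides' notation), each layer is a list of weight vectors, one per perceptron, with the bias weight first; the XOR network from earlier serves as a usage example, using the same example weights as before:

```python
def step(z):
    return 1 if z >= 0 else 0

def forward(layers, inputs):
    """Feedforward pass: each layer's perceptrons consume the
    previous layer's outputs; w[0] is the bias weight."""
    values = list(inputs)
    for layer in layers:
        values = [step(w[0] + sum(wi * vi for wi, vi in zip(w[1:], values)))
                  for w in layer]
    return values

# The XOR network as layered weights (illustrative values):
xor_layers = [
    [[-0.5, 1, 1],     # unit 3: OR
     [-1.5, 1, 1]],    # unit 4: AND
    [[-0.5, 1, -1]],   # unit 5: u3 AND (NOT u4)
]
```

For example, `forward(xor_layers, [1, 0])` returns `[1]`, matching the XOR truth table.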
What Neural Networks Can Compute
• An individual perceptron is a linear classifier:
– The weights of the perceptron define a linear boundary between two classes.
• Layered feedforward neural networks with one hidden layer can approximate any continuous function to arbitrary accuracy.
• Layered feedforward neural networks with two hidden layers can approximate any mathematical function.
• This has been known for decades, and it is one reason scientists have been optimistic about the potential of neural networks to model intelligent systems.
• Another reason is the analogy between neural networks and biological brains, which remain a standard of intelligence we are still trying to achieve.
• There is only one catch: how do we find the right weights?