Neural Networks: Perceptrons (continued) and Multi-layer Neural Networks

Outline
• Announcement
• Neural networks
  – Perceptrons – continued
  – Multi-layer neural networks
• Back-propagation
  – Applications

Announcement
• Talk by Julian Besag
  – Department of Statistics, Florida State University
  – 3:30 pm, Friday, March 9, 2001
  – Room 001 OSB
• Project presentation and report
  – Presentation and demo will be in Week 15
    • Two of you will share a class
  – Report is due at 5:00 pm on Wednesday, April 25, 2001

McCulloch and Pitts Model
[Figure: McCulloch–Pitts unit with inputs weighted by w_i1, …, w_in]
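A minimal sketch of the McCulloch–Pitts unit: the output fires (1) exactly when the weighted sum of the inputs reaches a threshold. The function name and the AND-gate weights below are illustrative choices, not from the slides.

```python
def mcculloch_pitts(inputs, weights, threshold):
    """McCulloch-Pitts unit: output 1 iff the weighted sum
    of the inputs reaches the threshold."""
    h = sum(w * x for w, x in zip(weights, inputs))
    return 1 if h >= threshold else 0

# Example: a two-input AND gate (weights and threshold chosen by hand).
print(mcculloch_pitts([1, 1], [1.0, 1.0], threshold=1.5))  # -> 1
print(mcculloch_pitts([1, 0], [1.0, 1.0], threshold=1.5))  # -> 0
```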

Activation Functions
• Activation functions
  – O = g(x)
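Two common choices of g, sketched for illustration (the slide does not spell out which are plotted): the hard threshold of the classic perceptron and the smooth sigmoid used later for back-propagation.

```python
import math

def step(h):
    """Hard threshold: the activation used by the classic perceptron."""
    return 1.0 if h >= 0.0 else 0.0

def sigmoid(h):
    """Smooth, differentiable alternative that saturates at 0 and 1."""
    return 1.0 / (1.0 + math.exp(-h))
```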

Issues of Neural Networks
• Issues to be solved when using neural networks
  – What kind of architecture should one use?
  – How should the connection weights be determined?
• The main advantage of neural networks is that efficient learning algorithms exist that can determine the connection weights automatically for a large class of networks

Layered Feed-Forward Networks
• Layered feed-forward networks were called perceptrons

Simple Perceptrons
• Simple perceptrons
  – One-layer feed-forward network
    • There is an input layer and an output layer, and no hidden layers
  – The computation can be described by O_i = g(Σ_j w_ij ξ_j)
    • Thresholds are omitted because they can always be treated as connections to an input terminal that is permanently set to –1
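A sketch of the one-layer computation with the threshold folded in as described above: appending a permanent –1 terminal to every input pattern lets the threshold be learned as an ordinary weight. The names are illustrative.

```python
def perceptron_output(weights, xi, g):
    """O = g(sum_j w_j * xi_j), with the threshold absorbed as the
    last weight, paired with a permanent -1 input terminal."""
    xi = list(xi) + [-1.0]                      # bias terminal fixed at -1
    h = sum(w * x for w, x in zip(weights, xi))
    return g(h)
```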

A Simple Learning Algorithm
• There is a learning algorithm for a simple perceptron network
  – Given a training pattern ξ^μ, the desired output is ζ_i^μ
  – The learning algorithm, or the procedure to change the weights, is Δw_ij = η (ζ_i^μ – O_i^μ) ξ_j^μ
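A sketch of one application of the rule above for a single output unit, assuming the standard perceptron update Δw_j = η(ζ – O)ξ_j; the learning rate η and the function name are illustrative.

```python
def perceptron_step(weights, xi, zeta, output, eta=0.1):
    """w_j <- w_j + eta * (zeta - O) * xi_j: the weights move
    only when the output disagrees with the target."""
    return [w + eta * (zeta - output) * x for w, x in zip(weights, xi)]
```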

Perceptron Classification Demo
• The feature space is the two-dimensional plane
• We have three training examples
  – One from the black category
  – Two from the white category
  – The line represents the decision boundary
• The network has two input neurons and one output neuron
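A self-contained version of this demo under assumed coordinates (the slide's actual points are not recoverable): three labelled points in the plane, a –1 bias terminal appended to each, and repeated application of the learning rule until the line separates the categories.

```python
def step(h):
    return 1.0 if h >= 0.0 else 0.0

# Assumed coordinates: one "black" point (target 1), two "white" (target 0);
# the third component of each pattern is the permanent -1 bias terminal.
patterns = [([2.0, 1.0, -1.0], 1.0),
            ([-1.0, 0.5, -1.0], 0.0),
            ([0.0, -1.5, -1.0], 0.0)]

w, eta = [0.0, 0.0, 0.0], 0.5
for _ in range(20):                                    # small fixed budget
    for xi, zeta in patterns:
        o = step(sum(wj * xj for wj, xj in zip(w, xi)))
        w = [wj + eta * (zeta - o) * xj for wj, xj in zip(w, xi)]

print(w)   # decision boundary: w[0]*x + w[1]*y - w[2] = 0
```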

Simple Perceptrons – cont.
• Convergence of the learning rule
  – One can prove mathematically that the learning rule converges to a solution in a finite number of learning steps, provided that a solution exists

Simple Perceptrons – cont.
• Linear separability
  – For simple perceptrons, the condition for correct operation is that a plane should divide the inputs that have positive and negative targets
  – This means the decision boundary is the plane where w · ξ = 0

Simple Perceptrons – cont.
• Linear units
• Gradient descent learning
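For a linear unit O = w · ξ the squared error E = ½ Σ (ζ – O)² is quadratic in the weights, and gradient descent gives the standard delta rule Δw_j = η(ζ – O)ξ_j. A sketch of one epoch, with illustrative names:

```python
def delta_rule_epoch(w, patterns, eta=0.05):
    """One pass of gradient descent for a single linear unit
    O = w . xi, minimizing the squared error (delta / LMS rule)."""
    for xi, zeta in patterns:
        o = sum(wj * xj for wj, xj in zip(w, xi))    # linear output, no g()
        w = [wj + eta * (zeta - o) * xj for wj, xj in zip(w, xi)]
    return w
```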

Simple Perceptrons – cont.
• Limitations of linear feed-forward networks
  – A multi-layer linear feed-forward network is exactly equivalent to a one-layer network in the computation it performs
    • A linear transformation of a linear transformation is again a linear transformation
  – Historically, this is a very important fact
    • No linear feed-forward network can solve linearly non-separable problems, such as the XOR problem
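A quick numerical check of the collapse argument, assuming NumPy is available: composing two weight matrices is the same as applying their product once.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))        # input -> "hidden", linear
W2 = rng.standard_normal((2, 4))        # "hidden" -> output, linear
xi = rng.standard_normal(3)

print(np.allclose(W2 @ (W1 @ xi),       # two linear layers...
                  (W2 @ W1) @ xi))      # ...equal one layer W2*W1: True
```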

Multi-layer Perceptrons
• The limitations of perceptrons do not apply to feed-forward networks that have hidden layers of nonlinear units between the input and output layers
• The problem is how to train such a network efficiently

Multi-layer Perceptrons – cont.
[Figure slide]

Multi-layer Perceptrons – cont.
• Back-propagation
  – Extension of the gradient descent learning rule
  – The hidden-to-output connections are updated by Δw_ij = η δ_i V_j, where δ_i = g′(h_i)(ζ_i – O_i) and V_j is the output of hidden unit j

Multi-layer Perceptrons – cont.
• Back-propagation – continued
  – The input-to-hidden connections are updated by Δw_jk = η δ_j ξ_k, where δ_j = g′(h_j) Σ_i w_ij δ_i

Activation Function
• Activation function
  – For back-propagation, the activation function must be differentiable
  – We also want it to saturate at both extremes
  – Sigmoid function: g(h) = 1 / (1 + e^(–h))
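A sketch of the sigmoid and the identity that makes it convenient for back-propagation, g′(h) = g(h)(1 − g(h)): the derivative is available from the unit's forward output alone.

```python
import math

def sigmoid(h):
    return 1.0 / (1.0 + math.exp(-h))

def sigmoid_prime(h):
    """g'(h) = g(h) * (1 - g(h)); reuses the forward output."""
    g = sigmoid(h)
    return g * (1.0 - g)
```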

Activation Function – cont.
[Figure slide: plot of the sigmoid activation function]

Back Propagation Algorithm
1. Initialize the weights to small random values
2. Choose a pattern ξ^μ and apply it to the input layer
3. Propagate the signal forward through the network
4. Compute the deltas (errors) for the output layer
5. Compute the deltas (errors) for the preceding layers by propagating the errors backwards
6. Update all the connection weights according to the learning rule
7. Go back to step 2 and repeat for the next pattern
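A compact sketch of steps 1–7 for one hidden layer, assuming NumPy, sigmoid units throughout, per-pattern (online) updates, and thresholds handled by the permanent –1 terminal from the earlier slide; all names, shapes, and hyperparameters are illustrative.

```python
import numpy as np

def sigmoid(h):
    return 1.0 / (1.0 + np.exp(-h))

def train_backprop(patterns, targets, n_hidden=4, eta=0.5, epochs=5000, seed=0):
    """One-hidden-layer back-propagation following steps 1-7 above."""
    rng = np.random.default_rng(seed)
    n_in, n_out = patterns.shape[1], targets.shape[1]
    # Thresholds: a permanent -1 terminal is appended to the input
    # pattern and to the hidden layer, so biases are ordinary weights.
    W1 = rng.normal(0.0, 0.5, (n_in + 1, n_hidden))     # 1. small random init
    W2 = rng.normal(0.0, 0.5, (n_hidden + 1, n_out))
    for _ in range(epochs):
        for xi, zeta in zip(patterns, targets):          # 2. choose a pattern
            xi = np.append(xi, -1.0)
            V = np.append(sigmoid(xi @ W1), -1.0)        # 3. forward pass
            O = sigmoid(V @ W2)
            d_out = (zeta - O) * O * (1 - O)             # 4. output deltas
            d_hid = (W2[:-1] @ d_out) * V[:-1] * (1 - V[:-1])  # 5. backward
            W2 += eta * np.outer(V, d_out)               # 6. update weights
            W1 += eta * np.outer(xi, d_hid)
    return W1, W2                                        # 7. next pattern
```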

Using Neural Networks
• Design phase
  – Choose the neural network architecture
• Training phase
  – Use the available examples to train the neural network
    • That is, use the back-propagation algorithm to learn the connection weights
• Test phase
  – For a new sample, feed its features through the trained network to get the result
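The three phases on the XOR problem from earlier, reusing the train_backprop and sigmoid sketches above (this snippet assumes they are in scope):

```python
import numpy as np

# Design phase: 2 inputs, a few hidden units, 1 output.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)          # XOR targets

# Training phase: back-propagation learns the connection weights.
W1, W2 = train_backprop(X, T, n_hidden=4, eta=0.5, epochs=5000)

# Test phase: feed features (with -1 terminals) through the network.
Xb = np.hstack([X, -np.ones((4, 1))])
Vb = np.hstack([sigmoid(Xb @ W1), -np.ones((4, 1))])
print(np.round(sigmoid(Vb @ W2)))                        # ~ [[0],[1],[1],[0]]
```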

Applications
• Application examples
  – NETtalk
  – Navigation of a car
  – Image compression
  – Recognizing hand-written ZIP codes
  – Speech recognition
  – Face recognition