10 Widrow-Hoff Learning (LMS Algorithm): ADALINE Network

  • Slides: 26
10 Widrow-Hoff Learning (LMS Algorithm) 1

10 ADALINE Network 2

The weight vector of neuron $i$:

$$
{}_i\mathbf{w} = \begin{bmatrix} w_{i,1} \\ w_{i,2} \\ \vdots \\ w_{i,R} \end{bmatrix}
$$

10 Two-Input ADALINE 3
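The slide's figure is not preserved; in the standard Widrow-Hoff development the two-input ADALINE computes

$$
a = \mathrm{purelin}(\mathbf{Wp} + b) = w_{1,1}p_1 + w_{1,2}p_2 + b,
$$

and the decision boundary is the line $a = 0$, i.e. ${}_1\mathbf{w}^T\mathbf{p} + b = 0$.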

10 Mean Square Error 4

Training Set:
Input:
Target:
Notation:
Mean Square Error:
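The slide's equations are not preserved; in the usual notation the training set is $\{\mathbf{p}_1, t_1\}, \{\mathbf{p}_2, t_2\}, \ldots, \{\mathbf{p}_Q, t_Q\}$, and with

$$
\mathbf{x} = \begin{bmatrix} {}_1\mathbf{w} \\ b \end{bmatrix}, \qquad \mathbf{z} = \begin{bmatrix} \mathbf{p} \\ 1 \end{bmatrix}, \qquad a = \mathbf{x}^T\mathbf{z},
$$

the mean square error is

$$
F(\mathbf{x}) = E[e^2] = E[(t-a)^2] = E[(t - \mathbf{x}^T\mathbf{z})^2].
$$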

10 Error Analysis 5

The mean square error for the ADALINE network is a quadratic function:
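Expanding the expectation (the slide's equation images are lost) gives the quadratic form

$$
F(\mathbf{x}) = E[t^2] - 2\mathbf{x}^T E[t\mathbf{z}] + \mathbf{x}^T E[\mathbf{z}\mathbf{z}^T]\,\mathbf{x} = c - 2\mathbf{x}^T\mathbf{h} + \mathbf{x}^T\mathbf{R}\mathbf{x},
$$

where $c = E[t^2]$, $\mathbf{h} = E[t\mathbf{z}]$, and $\mathbf{R} = E[\mathbf{z}\mathbf{z}^T]$ is the input correlation matrix. This matches the general quadratic $F(\mathbf{x}) = c + \mathbf{d}^T\mathbf{x} + \tfrac{1}{2}\mathbf{x}^T\mathbf{A}\mathbf{x}$ with $\mathbf{d} = -2\mathbf{h}$ and $\mathbf{A} = 2\mathbf{R}$.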

10 Stationary Point 6

Hessian Matrix: the correlation matrix R must be at least positive semidefinite. If there are any zero eigenvalues, the performance index will have either a weak minimum or no stationary point; otherwise there will be a unique global minimum x*. If R is positive definite:
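Reconstructing the missing equations: the gradient and Hessian of the quadratic are

$$
\nabla F(\mathbf{x}) = -2\mathbf{h} + 2\mathbf{R}\mathbf{x}, \qquad \nabla^2 F(\mathbf{x}) = 2\mathbf{R},
$$

so when R is positive definite the stationary point is the unique global minimum

$$
\mathbf{x}^* = \mathbf{R}^{-1}\mathbf{h}.
$$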

10 Approximate Steepest Descent 7

Approximate mean square error (one sample):
Approximate (stochastic) gradient:
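The one-sample approximation replaces the expectation with the squared error at iteration $k$:

$$
\hat{F}(\mathbf{x}) = (t(k) - a(k))^2 = e^2(k), \qquad \hat{\nabla}F(\mathbf{x}) = \nabla e^2(k).
$$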

10 Approximate Gradient Calculation 8
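The elements of the stochastic gradient, reconstructed from the standard derivation: since $e(k) = t(k) - ({}_1\mathbf{w}^T\mathbf{p}(k) + b)$,

$$
\frac{\partial e^2(k)}{\partial w_{1,j}} = 2e(k)\frac{\partial e(k)}{\partial w_{1,j}} = -2e(k)\,p_j(k), \qquad \frac{\partial e^2(k)}{\partial b} = -2e(k),
$$

which collects into $\hat{\nabla}F(\mathbf{x}) = \nabla e^2(k) = -2e(k)\,\mathbf{z}(k)$.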

10 LMS Algorithm 9
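Substituting the stochastic gradient into steepest descent, $\mathbf{x}(k+1) = \mathbf{x}(k) - \alpha\hat{\nabla}F(\mathbf{x}(k))$, gives the LMS (Widrow-Hoff) update:

$$
{}_1\mathbf{w}(k+1) = {}_1\mathbf{w}(k) + 2\alpha e(k)\,\mathbf{p}(k), \qquad b(k+1) = b(k) + 2\alpha e(k).
$$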

10 Multiple-Neuron Case 10

Matrix Form:
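The matrix form of the update is $\mathbf{W}(k+1) = \mathbf{W}(k) + 2\alpha\,\mathbf{e}(k)\mathbf{p}^T(k)$ and $\mathbf{b}(k+1) = \mathbf{b}(k) + 2\alpha\,\mathbf{e}(k)$. A minimal NumPy sketch of one step, with a usage loop on synthetic data (the target map `W_true` and the learning rate are illustrative assumptions, not from the deck):

```python
import numpy as np

def lms_step(W, b, p, t, alpha):
    """One LMS (Widrow-Hoff) update for a linear (ADALINE) layer."""
    a = W @ p + b                        # network output a = Wp + b
    e = t - a                            # error vector e = t - a
    W = W + 2 * alpha * np.outer(e, p)   # W(k+1) = W(k) + 2*alpha*e(k)*p(k)^T
    b = b + 2 * alpha * e                # b(k+1) = b(k) + 2*alpha*e(k)
    return W, b

# Usage: learn an assumed 2x2 linear map from random samples.
rng = np.random.default_rng(1)
W_true = np.array([[1.0, -2.0], [0.5, 3.0]])   # illustrative target map
W, b = np.zeros((2, 2)), np.zeros(2)
for _ in range(3000):
    p = rng.standard_normal(2)
    W, b = lms_step(W, b, p, W_true @ p, alpha=0.05)
```

Because an exact linear fit exists here, the stochastic updates drive the error to zero and `W` approaches `W_true`.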

10 Analysis of Convergence 11

For stability, the eigenvalues of this matrix must fall inside the unit circle.
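Taking the expectation of the LMS update with $\hat{\nabla}F = -2e(k)\mathbf{z}(k)$ gives (reconstructed from the standard analysis)

$$
E[\mathbf{x}(k+1)] = [\mathbf{I} - 2\alpha\mathbf{R}]\,E[\mathbf{x}(k)] + 2\alpha\mathbf{h},
$$

so stability requires the eigenvalues of $[\mathbf{I} - 2\alpha\mathbf{R}]$, namely $1 - 2\alpha\lambda_i$, to fall inside the unit circle.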

10 Conditions for Stability 12

(where $\lambda_i$ is an eigenvalue of R). Since $\alpha > 0$ and $\lambda_i > 0$, $1 - 2\alpha\lambda_i < 1$ always holds. Therefore the stability condition simplifies to $1 - 2\alpha\lambda_i > -1$, i.e. $\alpha < 1/\lambda_i$ for every eigenvalue, or $0 < \alpha < 1/\lambda_{max}$.

10 Steady State Response 13

If the system is stable, then a steady state condition will be reached. The solution of the steady state equation is also the strong minimum of the performance index.
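At steady state $E[\mathbf{x}(k+1)] = E[\mathbf{x}(k)] = \mathbf{x}_{ss}$, so (reconstructing the slide's equation)

$$
\mathbf{x}_{ss} = [\mathbf{I} - 2\alpha\mathbf{R}]\,\mathbf{x}_{ss} + 2\alpha\mathbf{h} \quad\Longrightarrow\quad \mathbf{x}_{ss} = \mathbf{R}^{-1}\mathbf{h} = \mathbf{x}^*.
$$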

10 Apple/Banana Example 14

10 Iteration One: Banana 15

10 Iteration Two: Apple 16

10 Iteration Three 17
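The deck's apple/banana numbers are not preserved; as a sketch, here is the LMS iteration on two assumed prototype patterns (three features, targets $-1$ for banana and $+1$ for apple). The prototypes and learning rate are illustrative assumptions:

```python
import numpy as np

# Assumed prototype patterns and targets; the deck's actual
# numbers are not preserved, so these are illustrative.
banana = (np.array([-1.0, 1.0, -1.0]), -1.0)
apple  = (np.array([ 1.0, 1.0, -1.0]),  1.0)

w = np.zeros(3)   # ADALINE weights
b = 0.0           # bias
alpha = 0.1       # assumed learning rate

for _ in range(200):                 # alternate presentations, as in the slides
    for p, t in (banana, apple):
        e = t - (w @ p + b)          # error for this input
        w = w + 2 * alpha * e * p    # LMS weight update
        b = b + 2 * alpha * e        # LMS bias update
```

Since the two patterns are linearly independent, a zero-error solution exists and the iteration converges to it.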

10 Adaptive Filtering 18

Tapped Delay Line; Adaptive Filter
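With a tapped delay line feeding an ADALINE, the adaptive filter output at time $k$ is (standard form; the slide's figure is lost)

$$
a(k) = \sum_{i=1}^{R} w_{1,i}\, y(k-i+1) + b,
$$

i.e. a linear FIR filter whose tap weights are trained by LMS.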

10 Example: Noise Cancellation 19

10 Noise Cancellation Adaptive Filter 20
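A minimal sketch of the two-tap noise canceller, with assumed signals: the noise path from source `v` to contaminating noise `m` is an invented linear function, and a random sequence stands in for the signal `s` (none of these numbers are from the deck). The filter learns to predict $m(k)$ from $[v(k), v(k-1)]$, so the error $e(k) = t(k) - a(k)$ recovers $s(k)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
k = np.arange(n)
v = 1.2 * np.sin(2 * np.pi * k / 3)   # noise source seen at the filter input
m = 0.5 * v - 0.3 * np.roll(v, 1)     # contaminating noise: assumed linear path
m[0] = 0.0
s = 0.2 * rng.standard_normal(n)      # stand-in for the desired signal
t = s + m                             # contaminated measurement (the target)

w = np.zeros(2)                       # two-tap adaptive filter
alpha = 0.02
e_hist = np.zeros(n)
for i in range(1, n):
    z = np.array([v[i], v[i - 1]])    # tapped-delay-line input
    a = w @ z                         # filter output: estimate of m(i)
    e = t[i] - a                      # error = restored signal estimate
    w = w + 2 * alpha * e * z         # LMS update
    e_hist[i] = e
```

After convergence the residual `e_hist` tracks `s` far more closely than the contaminated measurement `t` does.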

10 Correlation Matrix 21
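For the two-tap filter, the LMS quantities are built from $\mathbf{z}(k) = [v(k),\; v(k-1)]^T$ and target $t(k) = s(k) + m(k)$ (reconstructed notation):

$$
\mathbf{R} = E[\mathbf{z}\mathbf{z}^T], \qquad \mathbf{h} = E[t\,\mathbf{z}].
$$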

10 Stationary Point 23

$$
\mathbf{h} = E[t\,\mathbf{z}] = \begin{bmatrix} E[(s(k)+m(k))\,v(k)] \\ E[(s(k)+m(k))\,v(k-1)] \end{bmatrix}
$$

10 Performance Index 24

10 LMS Response 25

10 Echo Cancellation 26