Biologically-Inspired Neural Nets Modeling the Hippocampus
Hippocampus 101 • In 1957, Scoville and Milner reported on patient H.M. • Since then, numerous studies have used fMRI and PET scans to demonstrate use of the hippocampus during learning and recall • Numerous rat studies monitoring individual neurons demonstrate the existence of place cells • Generally, the hippocampus is associated with intermediate-term memory (ITM).
Hippocampus 101 • In 1994, Wilson and McNaughton demonstrated that sharp-wave bursts (SPWs) during sleep are time-compressed replays of sequences learned earlier • Levy hypothesizes that the hippocampus teaches learned sequences to the neocortex as part of a biased random process • Levy also hypothesizes that erasure/bias demotion happens when the neocortex signals to the hippocampus that the sequence was acquired, probably during slow-wave sleep (SWS).
Cornu Ammonis • The most significant feature of the hippocampus is the Cornu Ammonis (CA) • Most work in the Levy Lab focuses specifically on the CA3 region, although recently we’ve started re-examining the CA1 region as well
Minimal Model • CA3 recurrent activity [figure]
Typical Equations • Definitions: y_j = net excitation of neuron j; x_j = external input to j; z_j = output state of j; θ = threshold to fire; K_I = feedforward inhibition; K_R = feedback inhibition; K_0 = resting conductance; c_ij = connectivity from i to j; w_ij = weight between i and j; ε = rate constant of synaptic modification; α = spike decay rate; t = time
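The equations themselves did not survive extraction; the following is a hedged reconstruction of Levy-style minimal-model dynamics consistent with the definitions above (a sketch, not the slide's verbatim equations):

```latex
% Shunting net excitation of neuron j at time t (reconstruction)
y_j(t) = \frac{\sum_i w_{ij} c_{ij} z_i(t-1)}
              {\sum_i w_{ij} c_{ij} z_i(t-1) + K_R \sum_i z_i(t-1) + K_I \sum_i x_i(t) + K_0}

% McCulloch-Pitts threshold rule; external input can force firing
z_j(t) = \begin{cases} 1 & \text{if } y_j(t) \ge \theta \text{ or } x_j(t) = 1 \\
                       0 & \text{otherwise} \end{cases}

% Local associative modification at rate \varepsilon; \bar{z}_i(t) is
% presynaptic activity low-pass filtered with decay rate \alpha
w_{ij}(t+1) = w_{ij}(t) + \varepsilon\, z_j(t)\left(\bar{z}_i(t) - w_{ij}(t)\right)
```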
Fundamental Properties • Neurons are McCulloch-Pitts-type threshold elements • Synapses modify associatively via a local Hebbian-type rule • Most connections are excitatory • Recurrent excitation is sparse, asymmetric, and randomly connected • Inhibitory neurons approximately control net activity • In CA3, recurrent excitation contributes more to activity than external excitation • Activity is low, but not too low
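A minimal NumPy sketch of these properties, assuming the shunting dynamics reconstructed above; all names and parameter values are illustrative, not the Levy Lab's code:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1024           # number of neurons (illustrative)
P_CONNECT = 0.10   # sparse, random, asymmetric recurrent connectivity
THETA = 0.5        # threshold to fire
K_R, K_I, K_0 = 0.05, 0.02, 1.0   # feedback/feedforward inhibition, rest conductance
EPS = 0.05         # rate constant of synaptic modification

# Sparse, asymmetric, random excitatory connectivity c_ij (no self-connections)
c = (rng.random((N, N)) < P_CONNECT).astype(float)
np.fill_diagonal(c, 0.0)
w = 0.4 * c * rng.random((N, N))   # initial weights on existing connections

def step(z_prev, x):
    """One time step: recurrent excitation divided (shunted) by excitation
    plus feedback inhibition, feedforward inhibition, and rest conductance."""
    exc = z_prev @ (w * c)                         # recurrent drive onto each neuron
    y = exc / (exc + K_R * z_prev.sum() + K_I * x.sum() + K_0)
    return ((y >= THETA) | (x > 0)).astype(float)  # external input forces firing

def learn(z_prev, z):
    """Local Hebbian-type rule: when postsynaptic j fires, w_ij moves
    toward presynaptic activity z_i (simplified; no alpha-filtered trace)."""
    global w
    w += EPS * c * z[None, :] * (z_prev[:, None] - w)
```

The K_R term scales with total prior activity, so feedback inhibition acts as the approximate activity controller the slide describes.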
Model Variables • Functional: (1) average activity; (2) activity fluctuations; (3) sequence length memory capacity; (4) average lifetime of local context neurons; (5) speed of learning; (6) ratio of external to recurrent excitation • Actual: (1) number of neurons; (2) percent connectivity; (3) time span of synaptic associations; (4) threshold to fire; (5) feedback inhibition weight constant; (6) feedforward inhibition weight constant; (7) resting conductance; (8) rate constant of synaptic modification; (9) input code
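For concreteness, the "actual" variables could be gathered into one configuration object; a sketch with illustrative defaults (none of these values come from the slides):

```python
from dataclasses import dataclass

@dataclass
class CA3Params:
    """The slide's 'actual' model variables; defaults are illustrative."""
    n_neurons: int = 1024            # number of neurons
    pct_connectivity: float = 0.10   # percent connectivity
    assoc_time_span: int = 1         # time span of synaptic associations (steps)
    theta: float = 0.5               # threshold to fire
    k_r: float = 0.05                # feedback inhibition weight constant
    k_i: float = 0.02                # feedforward inhibition weight constant
    k_0: float = 1.0                 # resting conductance
    eps: float = 0.05                # rate constant of synaptic modification
    input_code: str = "random"       # how external patterns are encoded
```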
Eleven Problems: (1) simple sequence completion; (2) spontaneous rebroadcast; (3) one-trial learning; (4) jump-ahead recall; (5) sequence disambiguation (context past); (6) finding a shortcut; (7) goal finding (context future); (8) combining appropriate subsequences; (9) transverse patterning; (10) transitive inference; (11) trace conditioning
Sequence Completion • Train on sequence ABCDEFG • Provide input A • Network recalls BCDEFG
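With the sketch above, this protocol is a train/cue/free-run loop; a hypothetical run (the pattern coding and trial counts are assumptions):

```python
# Encode A..G as disjoint groups of 20 active neurons (illustrative coding)
patterns = {ch: np.zeros(N) for ch in "ABCDEFG"}
for k, ch in enumerate("ABCDEFG"):
    patterns[ch][k * 20:(k + 1) * 20] = 1.0

# Train: clamp the sequence repeatedly so A->B->...->G associations form
z = np.zeros(N)
for _ in range(20):                              # training trials (illustrative)
    for ch in "ABCDEFG":
        z_next = step(z, patterns[ch])
        learn(z, z_next)
        z = z_next

# Recall: cue with A alone, then free-run with no external input
z = step(np.zeros(N), patterns["A"])
for t in range(6):
    z = step(z, np.zeros(N))
    best = max("ABCDEFG", key=lambda ch: float(z @ patterns[ch]))
    print(t, best)                               # ideally B, C, D, E, F, G
```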
Rebroadcast • Train network on one or more sequences • Provide random input patterns • All or part of one of the trained sequences is recalled
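Continuing the hypothetical run above, rebroadcast just swaps the trained cue for a sparse random probe:

```python
# Spontaneous rebroadcast: probe with a random pattern instead of a trained cue
probe = (rng.random(N) < 0.02).astype(float)     # sparse random input (illustrative)
z = step(np.zeros(N), probe)
for _ in range(6):
    z = step(z, np.zeros(N))
    print(max("ABCDEFG", key=lambda ch: float(z @ patterns[ch])))
```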
One-trial learning • Requires a high rate of synaptic modification (large ε) • Does not use the same parameters as the other problems • Models short-term memory (STM) instead of the hippocampus’s intermediate-term memory (ITM)
Jump-ahead recall • With adjusted inhibition, sequence completion can be short-circuited • Train network on ABCDEFG • Provide A • Network recalls G or possibly BDG, etc. • Inhibition in the hippocampus does vary (see the sketch below)
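One hedged way to expose "adjusted inhibition" in the sketch above is to scale K_R during recall; the direction and magnitude of the adjustment here are assumptions, not the slides' values:

```python
def recall_with_inhibition(cue, k_r_scale, steps=6):
    """Free-run recall with feedback inhibition temporarily scaled;
    changing K_R changes how many neurons fire, which can skip patterns."""
    global K_R
    saved, K_R = K_R, K_R * k_r_scale
    z = step(np.zeros(N), cue)
    out = []
    for _ in range(steps):
        z = step(z, np.zeros(N))
        out.append(max("ABCDEFG", key=lambda ch: float(z @ patterns[ch])))
    K_R = saved
    return out

print(recall_with_inhibition(patterns["A"], k_r_scale=2.0))  # may jump ahead
```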
Disambiguation • Train network on patterns ABC 456 GHI and abc 456 ghi • Present pattern A to the network • Network recalls BC 456 GHI • Requires patterns 4, 5, and 6 to be coded differently depending on past context
Shortcuts • Train network on pattern ABC 456 GHIJKL 456 PQR • Present pattern A to the network • Network recalls BC 456 PQR • Uses common neurons of patterns 4, 5, and 6 to generate a shortcut
Goal Finding • Train network on pattern ABC 456 GHIJKL 456 PQR • Present pattern A and part of pattern K to the network • Network recalls BC 456 GHIJK… • Requires use of context future
Combinations • Train network on patterns ABC 456 GHI and abc 456 ghi • Present pattern A and part of pattern i to the network • Network recalls BC 456 ghi • Also requires use of context future
Transverse Patterning • Similar to rock, paper, scissors • Train network on sequences [AB]a+, [AB]b-, [BC]b+, [BC]c-, [AC]c+, [AC]a- • Present [AB] and part of + to the network and it will generate a • Present [BC] and part of + to the network and it will generate b • Present [AC] and part of + to the network and it will generate c
Transitive Inference • Transitivity: if A>B and B>C, then A>C • Train network on [AB]a+, [AB]b-, [BC]b+, [BC]c-, [CD]c+, [CD]d-, [DE]d+, [DE]e- • Present [BD] and part of + to the network, and it will generate b
Trace Conditioning • Train network on sequence A……B • Vary the amount of time between presentation of pattern A and pattern B • Computational results match experimental results on trace conditioning in rabbits
Important Recent Discoveries • Addition of a random “starting pattern” improves network performance • Synaptic failures improve performance (and reduce energy requirements; see the sketch below) • Addition of a CA1 decoder improves performance
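A minimal sketch of the synaptic-failure idea, assuming failures act as independent per-step transmission dropout on each synapse (the failure probability is an assumption):

```python
P_FAIL = 0.5  # probability a given transmission fails this step (illustrative)

def step_with_failures(z_prev, x):
    """Same shunting dynamics as step(), but each synapse transmits only
    with probability 1 - P_FAIL, zeroing its contribution when it fails."""
    transmit = (rng.random((N, N)) >= P_FAIL).astype(float)
    exc = z_prev @ (w * c * transmit)
    y = exc / (exc + K_R * z_prev.sum() + K_I * x.sum() + K_0)
    return ((y >= THETA) | (x > 0)).astype(float)
```

Functionally this resembles per-step dropout: stochastic transmission decorrelates recall trajectories, which is one plausible reading of why it improves performance while reducing transmission energy.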