Entropy and Information Theory
Tahereh Toosi, IPM

Recap [Churchland & Abbott, 2012]

Outline
• Some general problems in Neuroscience
• Information Theory
• IT in Neuroscience
• Entropy
• Mutual Information
• How to use Mutual Information?
• Summary

Some general problems
• Coding scheme:
  • What is being encoded?
  • How is it being encoded?
  • With what precision?
  • With what limitations?
• Multi-trial or single-trial?
• Goodness of our neural decoding model
[Borst and Theunissen, 1999]

[Figure: stimulus space spanned by stimulus parameters S1 and S2, showing the distribution of stimuli P[S|{ti}] given a spike train, the estimate of the stimulus given the spike train, and the accessible region for all stimuli in P[S]] [Spikes: Exploring the Neural Code]

Statistical significance of how neural responses vary with different stimuli [Borst and Theunissen, 1999]

Shannon helps… [Borst and Theunissen, 1999]

Information Theory
Information is reported in units of ‘bits’, with h = −log2 P:
• The minus sign makes h a decreasing function of its argument, as required.
• Note that information is really a dimensionless number.
[Abbott and Dayan, 2001]
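A minimal numeric sketch of this definition (the probabilities are made up for illustration): an event with probability P carries h = −log2 P bits, so rarer responses are more surprising.

```python
import numpy as np

def surprise_bits(p):
    """Information (surprise) h = -log2(p), in bits, for an event of probability p."""
    return -np.log2(p)

# Rarer events carry more information:
for p in (0.5, 0.25, 0.01):
    print(f"P = {p}: h = {surprise_bits(p):.2f} bits")
# P = 0.5 -> 1.00, P = 0.25 -> 2.00, P = 0.01 -> 6.64
```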

Encoding numbers in a digital code! One hand can carry log2(6) ≈ 2.58 bits. [Spikes: Exploring the Neural Code]
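The arithmetic behind this example, as a quick check (the 5-bit "binary fingers" comparison is an added illustration of the digital-code idea, not a number from the slide):

```python
import math

# Counting raised fingers on one hand distinguishes 6 states (0..5 fingers up):
print(math.log2(6))       # ~2.58 bits

# Reading the 5 fingers as independent up/down binary digits gives 2**5 = 32 states:
print(math.log2(2 ** 5))  # 5.0 bits
```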

Channel capacity of a neuron [Borst and Theunissen, 1999]

Information loss

The entropy of a binary code
Entropy quantifies the surprise or unpredictability associated with a particular response. Shannon’s entropy is just this measure averaged over all responses:
H = −(1 − P[r+]) log2(1 − P[r+]) − P[r+] log2 P[r+]
[Abbott and Dayan, 2001]
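A short sketch of the binary-code entropy above, writing P[r+] as p (the function name is illustrative):

```python
import numpy as np

def binary_entropy(p):
    """H = -(1 - p) log2(1 - p) - p log2(p) for a binary response with P[r+] = p."""
    if p in (0.0, 1.0):   # the limit p * log2(p) -> 0 is taken as 0
        return 0.0
    return -(1 - p) * np.log2(1 - p) - p * np.log2(p)

print(binary_entropy(0.5))  # 1.0 bit: maximally unpredictable response
print(binary_entropy(0.1))  # ~0.47 bits: mostly predictable response
```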

Mutual Information: idea
• To convey information about a set of stimuli, neural responses must be different for different stimuli.
• Entropy is a measure of response variability.
• Mutual Information idea: compare the responses obtained using a different stimulus on every trial with those measured in trials involving repeated presentations of the same stimulus.
• The mutual information is the difference between the total response entropy and the average response entropy on trials that involve repeated presentation of the same stimulus.

Mutual Information
The entropy of the responses to a given stimulus s:
H_s = −Σ_r P[r|s] log2 P[r|s]
The noise entropy:
H_noise = Σ_s P[s] H_s
This is the entropy associated with the part of the response variability that is not due to changes in the stimulus, but arises from other sources.
Mutual Information:
I = H − H_noise
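A sketch of these definitions for discrete responses, assuming P[r|s] is given as a (stimuli × responses) array and P[s] as a vector (all variable names are illustrative):

```python
import numpy as np

def entropy(p):
    """H = -sum p log2 p over the nonzero entries of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_r_given_s, p_s):
    """I = H - H_noise, where H_noise = sum_s P[s] * H(responses | s)."""
    p_r = p_s @ p_r_given_s                 # total response distribution P[r]
    h_total = entropy(p_r)                  # total response entropy H
    h_noise = np.sum(p_s * np.array([entropy(row) for row in p_r_given_s]))
    return h_total - h_noise

# Two stimuli, two responses; responses track the stimulus fairly reliably:
p_r_given_s = np.array([[0.9, 0.1],
                        [0.2, 0.8]])
p_s = np.array([0.5, 0.5])
print(mutual_information(p_r_given_s, p_s))  # ~0.40 bits
```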

How to use Information Theory:
1. Show your system stimuli.
2. Measure neural responses.
3. Estimate: P( neural response | stimulus presented )
4. From that, estimate: P( neural response )
5. Compute: H(neural response) and H(neural response | stimulus presented)
6. Calculate: I(response ; stimulus)
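A minimal end-to-end sketch of steps 1-6 on simulated spike counts; it uses naive plug-in (histogram) estimates and ignores the sampling pitfalls listed on the "How to screw it up" slide (all names and numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-2. Show stimuli and measure responses: simulate spike counts for 2 stimuli.
n_trials = 5000
stimuli = rng.integers(0, 2, size=n_trials)          # which stimulus was shown
rates = np.where(stimuli == 0, 2.0, 6.0)             # mean spike count per stimulus
responses = rng.poisson(rates)                       # observed spike counts

# 3-4. Estimate P(response | stimulus) and P(response) from histograms.
n_r = responses.max() + 1
p_r_given_s = np.array([np.bincount(responses[stimuli == s], minlength=n_r)
                        for s in (0, 1)], dtype=float)
p_r_given_s /= p_r_given_s.sum(axis=1, keepdims=True)
p_s = np.bincount(stimuli) / n_trials
p_r = p_s @ p_r_given_s

# 5-6. Compute H(response), H(response | stimulus), and I(response; stimulus).
def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

h_total = entropy(p_r)
h_noise = np.sum(p_s * np.array([entropy(row) for row in p_r_given_s]))
print(f"H = {h_total:.2f} bits, H_noise = {h_noise:.2f} bits, I = {h_total - h_noise:.2f} bits")
```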

Entropy: spike count
Response     Spike count
11001010     4
01000110     3
00110111     5
11000000     2
(one row per stimulus)
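A worked check of the table: the four responses all have different spike counts, so if each row is equally likely the spike-count entropy is log2(4) = 2 bits.

```python
from collections import Counter
from math import log2

responses = ["11001010", "01000110", "00110111", "11000000"]
spike_counts = [r.count("1") for r in responses]     # [4, 3, 5, 2]

# Empirical distribution of spike counts, assuming each response occurs equally often.
probs = [n / len(spike_counts) for n in Counter(spike_counts).values()]
entropy = -sum(p * log2(p) for p in probs)
print(spike_counts, entropy)                         # [4, 3, 5, 2] 2.0
```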

How to screw it up:
• Choose stimuli which are not representative.
• Measure the “wrong” aspect of the response.
• Don’t take enough data to estimate P( ) well.
• Use a crappy method of computing H( ).
• Calculate I( ) and report it without comparing it to anything…
[Math for Neuroscience, Stanford]

Summary
• Strengths of IT in Neuroscience:
  • Coding efficiency: comparing the overall information transfer to the maximum spike train entropy
  • Calculating the absolute amount of information transmitted
  • Identifying system nonlinearities and validating nonlinear system models
  • Neural code precision: the measure of Mutual Information
  • Goodness of our neuronal decoding model: comparison of upper/lower estimates to the direct estimate

THANK YOU!
