Information-theoretic analysis of neural data: why do it, why it is challenging, and what can be learned

old_uid: 8781
title: Information-theoretic analysis of neural data: why do it, why it is challenging, and what can be learned
start_date: 2010/05/27
schedule: 12h
online: no
summary: Entropy and information are of interest to neuroscientists because of their mathematical properties and because they place limits on the performance of a neural system. Estimating these quantities from neural spike trains is much more challenging than estimating other statistics, such as the mean and variance. The central difficulty in estimating information is tightly linked to the very properties of information that make it a desirable quantity to estimate. To surmount this fundamental difficulty, most approaches to estimating information rely (perhaps implicitly) on a model of how spike trains are related. The nature of these model assumptions varies widely. As a result, information estimates are useful not only when several approaches yield mutually consistent results, but also when they differ. These ideas are illustrated with examples from the visual and gustatory systems.
responsibles: Tchang
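
As a rough illustration of the estimation problem described in the summary, the sketch below (not part of the lecture; it assumes Poisson-distributed spike counts and a naive plug-in estimator) shows how an entropy estimate computed from a small number of trials is systematically biased downward, whereas the sample mean of the same data is essentially unbiased.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: spike counts drawn from a Poisson distribution.
true_rate = 3.0      # assumed mean spike count per trial
n_trials = 50        # small sample, as in a typical experiment
n_repeats = 1000     # Monte Carlo repeats to expose estimator bias

def plugin_entropy(counts):
    """Plug-in (maximum-likelihood) entropy estimate, in bits."""
    _, freq = np.unique(counts, return_counts=True)
    p = freq / freq.sum()
    return -np.sum(p * np.log2(p))

mean_estimates, entropy_estimates = [], []
for _ in range(n_repeats):
    counts = rng.poisson(true_rate, size=n_trials)
    mean_estimates.append(counts.mean())               # sample mean: unbiased
    entropy_estimates.append(plugin_entropy(counts))   # plug-in entropy: biased downward

print(f"average sample mean     : {np.mean(mean_estimates):.3f} (true rate {true_rate})")
print(f"average plug-in entropy : {np.mean(entropy_estimates):.3f} bits")

# Large-sample reference entropy for the same count distribution.
big_sample = rng.poisson(true_rate, size=10**6)
print(f"large-sample entropy    : {plugin_entropy(big_sample):.3f} bits")

With only 50 trials, the average plug-in entropy falls noticeably below the large-sample value, while the average sample mean matches the true rate. This is one concrete form of the bias that makes information estimates depend on (often implicit) modeling assumptions.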