This article is part of the supplement: Sixteenth Annual Computational Neuroscience Meeting: CNS*2007

Open Access Poster presentation

Do high order cross-correlations among neurons convey useful information?

Riccardo Senatore* and Stefano Panzeri

Author Affiliations

Faculty of Life Sciences, The University of Manchester, The Mill, PO Box 88, Manchester M60 1QD, UK

BMC Neuroscience 2007, 8(Suppl 2):P155  doi:10.1186/1471-2202-8-S2-P155

Published: 6 July 2007

© 2007 Senatore and Panzeri; licensee BioMed Central Ltd.

Poster presentation

Populations of neurons in the brain encode information about the external environment as time series of spikes. A fundamental question in systems neuroscience is how this information can be decoded by a downstream neural system. Since the responses of different neurons are statistically correlated [1], it is possible that such correlations convey important information and thus need to be taken into account by any decoding algorithm. Although coding by correlations may increase the capacity of a neural population to encode information, it may also greatly complicate decoding. It is possible that all neurons within the population interact with each other, and that their interactions cannot be described only in terms of "pair-wise" or other low-order interactions between neurons, but instead reflect genuine higher-order interactions among a larger population. In such a case, the number of parameters describing these correlations would increase exponentially with population size, drastically increasing the complexity of the codes we want to investigate. On the other hand, it is also possible that a downstream system can access all the information available in the population activity while taking into account only low-order correlations among neurons. In this way, the brain could exploit some of the representational capacity offered by correlation codes while limiting the complexity needed to decode them.

Conceptualizing neurons as communication channels, we can quantify how much Shannon mutual information, I, is available to a decoder that observes the neural responses and knows the true stimulus-response probabilities. We can also compare it with a lower bound, Ik, on how much information could be decoded by a decoder that assumes a simpler structure of correlation, taking into account only statistical correlations between neurons up to the k-th order [2,3]. A principled way to construct such models from experimental data is to build the maximum-entropy response distribution among those with the same marginal probabilities (and thus correlations) up to order k as the real population responses. We quantify the importance of higher-order correlations as the decoding cost (I - Ik) of neglecting them.
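
These quantities can be illustrated with a minimal numerical sketch (a hypothetical toy example, not the estimator used in the study): a two-neuron, two-stimulus population in which each neuron's firing rate is identical across stimuli, so that all the information is carried by the sign of the pairwise correlation. The order-1 maximum-entropy surrogate (conditionally independent neurons with matched rates) then captures none of the information. Note that the information of the surrogate distribution computed here is a simpler quantity than the decoding bound Ik of [2,3]; it merely illustrates how a low-order model can miss correlation-coded information.

```python
import numpy as np

def mutual_information(p_sr):
    """Shannon mutual information (bits) from a joint stimulus-response table.
    p_sr[s, r] = joint probability of stimulus s and response pattern r."""
    p_s = p_sr.sum(axis=1, keepdims=True)
    p_r = p_sr.sum(axis=0, keepdims=True)
    mask = p_sr > 0
    return float(np.sum(p_sr[mask] * np.log2(p_sr[mask] / (p_s @ p_r)[mask])))

# Hypothetical toy example: 2 stimuli, 2 binary neurons, response patterns
# ordered 00, 01, 10, 11.  Each neuron fires with probability 0.5 under both
# stimuli, so single-cell rates are uninformative; only the pairwise
# correlation distinguishes the stimuli.
p_r_given_s = np.array([
    [0.4, 0.1, 0.1, 0.4],   # stimulus 0: neurons tend to fire together
    [0.1, 0.4, 0.4, 0.1],   # stimulus 1: neurons tend to fire apart
])
p_s = np.array([0.5, 0.5])
I_true = mutual_information(p_s[:, None] * p_r_given_s)

# Order-1 maximum-entropy surrogate: keep each neuron's firing rate per
# stimulus but make the neurons conditionally independent.
p1 = p_r_given_s[:, [2, 3]].sum(axis=1)   # p(neuron 1 fires | s)
p2 = p_r_given_s[:, [1, 3]].sum(axis=1)   # p(neuron 2 fires | s)
p_ind = np.stack([(1 - p1) * (1 - p2), (1 - p1) * p2,
                  p1 * (1 - p2), p1 * p2], axis=1)
I_1 = mutual_information(p_s[:, None] * p_ind)

print(f"information in the full (correlated) model : {I_true:.4f} bits")  # ~0.2781
print(f"information in the order-1 surrogate       : {I_1:.4f} bits")     # 0.0000
```

With two neurons the order-2 model already reproduces the full distribution, but for larger populations the gap between successive orders k is exactly what the decoding cost (I - Ik) measures.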

We demonstrate that, by using appropriate bias-correction statistical techniques [4], this lower bound can be made data-robust and computed with the limited number of trials typically recorded in neurophysiology experiments. With 200 trials per stimulus, we could compute the contribution of information conveyed by all higher-order correlations for groups of up to 10 neurons. Taken together, these results suggest that the application of the method proposed here will unravel the role of high-order correlations among neurons in sensory coding, thus giving insight into the complexity of the code.
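
The need for bias correction arises because plug-in information estimates from finite trial counts are systematically inflated. The sketch below illustrates this with the classical first-order (Panzeri-Treves / Miller-Madow style) correction, which is a simpler scheme than, and distinct from, the shuffling and model-selection techniques of [4]; the generating distribution is hypothetical.

```python
import numpy as np

def plugin_info(counts):
    """Plug-in (maximum-likelihood) mutual information estimate, in bits.
    counts[s, r] = number of trials with stimulus s and response bin r."""
    p_sr = counts / counts.sum()
    p_s = p_sr.sum(axis=1, keepdims=True)
    p_r = p_sr.sum(axis=0, keepdims=True)
    mask = p_sr > 0
    return float(np.sum(p_sr[mask] * np.log2(p_sr[mask] / (p_s @ p_r)[mask])))

def pt_bias(counts):
    """First-order (Panzeri-Treves) estimate of the upward bias of plugin_info:
    bias ~ [sum_s (R_s - 1) - (R - 1)] / (2 N ln 2), where R_s and R count the
    occupied response bins per stimulus and overall, and N is the trial count."""
    n = counts.sum()
    r_s = (counts > 0).sum(axis=1)
    r = (counts.sum(axis=0) > 0).sum()
    return (np.sum(r_s - 1) - (r - 1)) / (2 * n * np.log(2))

# Hypothetical ground truth: 2 stimuli, 4 response bins.
true_p = np.array([[0.4, 0.1, 0.1, 0.4],
                   [0.1, 0.4, 0.4, 0.1]])
i_true = plugin_info(true_p)           # exact I of the generating distribution

rng = np.random.default_rng(0)
n_trials = 40                          # trials per stimulus (a small sample)
plug, corr = [], []
for _ in range(200):                   # average over simulated experiments
    counts = np.stack([np.bincount(rng.choice(4, size=n_trials, p=p), minlength=4)
                       for p in true_p])
    i_plug = plugin_info(counts)
    plug.append(i_plug)
    corr.append(i_plug - pt_bias(counts))

print(f"true I          : {i_true:.4f} bits")
print(f"plug-in mean    : {np.mean(plug):.4f} bits (biased upward)")
print(f"bias-corrected  : {np.mean(corr):.4f} bits")
```

The first-order correction suffices for this tiny response space; the data-robust bounds of [4] are designed for the much larger pattern spaces that arise with populations of up to 10 neurons.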

Acknowledgements

Thanks to M. A. Montemurro for useful discussions. Supported by Pfizer Global Development RTD and the Royal Society.

References

  1. Averbeck BB, Latham PE, Pouget A: Neural correlations, population coding and computation. Nature Rev Neurosci 2006, 7:358-366.

  2. Latham PE, Nirenberg S: Synergy, redundancy, and independence in population codes, revisited. J Neurosci 2005, 25:5195-5206.

  3. Amari SH: Information geometry on hierarchy of probability distributions. IEEE Trans Inform Theory 2001, 47:1701-1711.

  4. Montemurro MA, Senatore R, Panzeri S: Tight data-robust bounds to mutual information combining shuffling and model selection techniques. Neural Computation 2007, in press.