
This article is part of the supplement: Eighteenth Annual Computational Neuroscience Meeting: CNS*2009

Open Access Poster presentation

Reservoir computing methods for functional identification of biological networks

Tayfun Gürel^1*, Stefan Rotter^1,3 and Ulrich Egert^1,2

Author Affiliations

1 Bernstein Center for Computational Neuroscience Freiburg, Germany

2 Biomicrotechnology, Department of Microsystems Engineering – IMTEK, University of Freiburg, Germany

3 Computational Neuroscience, Faculty of Biology, University of Freiburg, Germany


BMC Neuroscience 2009, 10(Suppl 1):P293  doi:10.1186/1471-2202-10-S1-P293


The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1471-2202/10/S1/P293


Published: 13 July 2009

© 2009 Gürel et al; licensee BioMed Central Ltd.

Poster presentation

The complexity of biological neural networks (BNNs) necessitates automated methods for investigating their stimulus-response and structure-dynamics relations. In the present work, we aim to build a network that is functionally equivalent to a reference BNN. The response signal of the BNN to various input streams is regarded as a characterization of its function. We therefore train an artificial system to imitate the input-output relation of the reference BNN over the applied stimulus range; in other words, we take a system identification approach to biological neural networks. Generic network models with fixed random connectivity, recurrent dynamics and fading memory, known as reservoirs, have been shown to have a strong separation property on various input streams. Equipped with additional simple readout units, such systems have been successfully applied to several nonlinear modeling and engineering tasks [1].
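The reservoir principle described above (a fixed random recurrent network whose states are read out by a trained linear unit) can be illustrated with a minimal NumPy sketch. All dimensions, weight scales, and the delayed-copy target below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper)
n_in, n_res, n_steps = 1, 100, 500

# Fixed random input and recurrent weights; the recurrent matrix is
# rescaled so its spectral radius is below 1, a common sufficient
# condition for fading memory ("echo states").
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Collect reservoir states for an input stream u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + W_in @ u_t)  # recurrent update, weights stay fixed
        states[t] = x
    return states

# Only the linear readout is trained (here by least squares), mapping
# reservoir states onto a target signal, e.g. a delayed copy of the input.
u = rng.uniform(-1, 1, (n_steps, n_in))
target = np.roll(u[:, 0], 3)              # input delayed by 3 steps
X = run_reservoir(u)
w_out, *_ = np.linalg.lstsq(X[10:], target[10:], rcond=None)
prediction = X @ w_out
```

The point of the sketch is the division of labor: the random recurrent part separates input histories, and only the simple readout is adapted to the task.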

Here we take a reservoir computing approach to the functional identification of simulated random BNNs and of neuronal cell cultures [2]. More specifically, we use an Echo State Network (ESN) of leaky-integrator (non-spiking) neurons with sigmoid activation functions to identify a BNN. We propose algorithms that adapt the ESN parameters to model the relations between continuous input streams and multi-unit recordings in BNNs. Our findings indicate that trained ESNs can imitate the response signal of a reference biological network on several tasks. For instance, we trained an ESN to estimate the instantaneous firing rate (conditional intensity) of a randomly selected neuron in a simulated BNN. Receiver operating characteristic (ROC) curve analysis showed that the ESN can indeed estimate the conditional intensity of this neuron (see Figure 1).
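A leaky-integrator ESN and a conditional-intensity readout of the kind described above can be sketched as follows. The surrogate stimulus, the spike-generation rule, and all parameter values are hypothetical stand-ins for the paper's simulated BNN data; the update rule is the standard leaky-integrator ESN form, here with a sigmoid nonlinearity:

```python
import numpy as np

rng = np.random.default_rng(1)
n_res, leak, n_steps = 80, 0.3, 400   # illustrative values

W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def run(u):
    """Leaky-integrator ESN: the leak rate sets the unit time constant;
    leak = 1 recovers a standard sigmoid ESN."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t in range(len(u)):
        x = (1 - leak) * x + leak * sigmoid(W @ x + W_in @ u[t])
        states[t] = x
    return states

# Surrogate data: a continuous stimulus and a binary spike train whose
# probability follows the stimulus; this stands in for the multi-unit
# recordings used in the study.
u = rng.uniform(0, 1, (n_steps, 1))
p = 0.05 + 0.4 * u[:, 0]                       # true conditional intensity
spikes = (rng.random(n_steps) < p).astype(float)

# A linear readout regressed onto the spike indicator; its output is an
# estimate of the conditional intensity lambda(t).
X = run(u)
w_out, *_ = np.linalg.lstsq(X[20:], spikes[20:], rcond=None)
lam = X @ w_out
```

Regressing a readout onto a 0/1 spike indicator is one simple way to obtain an intensity estimate; the paper's actual training algorithms adapt further ESN parameters and are not reproduced here.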

Figure 1. Estimated conditional intensity for a selected neuron of a biological neural network. Conditional intensity estimates, λ, for all time steps in the testing period, shown in decreasing order; a bar marks each time step in which a spike was actually observed (top). Distributions of conditional intensity for time steps with and without observed spikes (middle). By varying a threshold on λ, true positive rates can be computed against false positive rates (bottom).
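The ROC construction in the bottom panel, i.e. sweeping a threshold over λ and plotting true against false positive rates, can be sketched as below. The toy λ values and spike train are invented for illustration:

```python
import numpy as np

def roc_curve(lam, spikes):
    """True/false positive rates obtained by sweeping a threshold over the
    conditional-intensity estimates lam against the binary spike train."""
    order = np.argsort(-lam)            # time steps by decreasing lam
    hits = spikes[order].astype(bool)
    tpr = np.cumsum(hits) / max(hits.sum(), 1)
    fpr = np.cumsum(~hits) / max((~hits).sum(), 1)
    return np.concatenate(([0.0], fpr)), np.concatenate(([0.0], tpr))

# Toy example: estimates that perfectly rank spikes above non-spikes
# give a curve through (0, 1) and unit area under the curve.
lam = np.array([0.9, 0.8, 0.3, 0.2])
spikes = np.array([1, 1, 0, 0])
fpr, tpr = roc_curve(lam, spikes)
auc = np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2)  # trapezoidal rule
```

Each threshold on λ corresponds to one point on the curve, so sorting the time steps by λ, as in the top panel of Figure 1, yields the whole curve in one pass.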

Acknowledgements

This work was supported by the German BMBF (BCCN Freiburg, 01GQ0420) and the European Community (NEURO no 12788).

References

  1. Jaeger H: The "echo state" approach to analysing and training recurrent neural networks. GMD Report 148, GMD – German National Research Institute for Computer Science; 2001.

  2. Marom S, Shahaf G: Development, learning and memory in large random networks of cortical neurons: lessons beyond anatomy. Q Rev Biophys 2002, 35:63-87.