  • Poster presentation
  • Open access

Deterministic neural networks as sources of uncorrelated noise for probabilistic computations

Neural-network models of brain function often rely on the presence of noise [1]. To date, the interplay between microscopic noise sources and network function is only poorly understood. In computer simulations and in neuromorphic hardware [5-7], the number of noise sources (random-number generators) is limited. As a consequence, neurons in large functional network models have to share noise sources and are therefore correlated. In general, it is unclear how shared-noise correlations affect the performance of functional network models. Further, there is so far no solution to the problem of how a limited number of noise sources can supply a large number of functional units with uncorrelated noise.
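The basic effect of a limited source pool can be illustrated with a small numerical sketch (a hypothetical toy setup, not the spiking-network model studied in the poster): if each of N units draws its background noise from one generator out of a shared pool of M streams, the mean pairwise correlation of the background activity is set by the fraction of unit pairs that happen to share a generator, roughly 1/M.

```python
import numpy as np

rng = np.random.default_rng(0)

def background_noise(n_units, n_sources, n_steps, rng):
    """Assign each unit one noise stream from a pool of n_sources generators."""
    sources = rng.standard_normal((n_sources, n_steps))
    pick = rng.integers(0, n_sources, size=n_units)
    return sources[pick]

def mean_pairwise_corr(x):
    """Average off-diagonal entry of the correlation matrix."""
    c = np.corrcoef(x)
    return c[~np.eye(len(c), dtype=bool)].mean()

# few sources -> many units share a stream -> strong correlations (~ 1/4)
few_corr = mean_pairwise_corr(background_noise(50, 4, 10000, rng))
# many sources -> sharing is rare -> correlations close to zero (~ 1/400)
many_corr = mean_pairwise_corr(background_noise(50, 400, 10000, rng))
```

The 1/M scaling of the shared-noise correlation in this toy picture mirrors the inverse scaling of sampling deviations with the number of noise sources reported in the abstract.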

Here, we investigate the performance of neural Boltzmann machines [2-4]. We show that correlations in the background activity are detrimental to the sampling performance and that the deviations from the target distribution scale inversely with the number of noise sources. Further, we show that this problem can be overcome by replacing the finite ensemble of independent noise sources with a recurrent neural network of the same number of units. As shown recently, inhibitory feedback, abundant in biological neural networks, serves as a powerful decorrelation mechanism [8, 9]: shared-noise correlations are actively suppressed by the network dynamics. By exploiting this effect, the network performance is significantly improved. Hence, recurrent neural networks can serve as natural finite-size noise sources for functional neural networks, both in biological and in synthetic neuromorphic substrates. Finally, we investigate how the parameters of the sampling network affect its ability to faithfully represent a given well-defined distribution. We show that sampling networks with sufficiently strong negative feedback can intrinsically suppress correlations in the background activity, and thereby improve their performance substantially.
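As a rough illustration of the sampling setup and the deviation measure (a minimal abstract Boltzmann machine with Gibbs updates, not the deterministic spiking implementation studied here), one can sample a small network and quantify the mismatch between the sampled and target distributions, e.g. via the Kullback-Leibler divergence:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)

# An arbitrary 3-unit Boltzmann machine: symmetric weights, zero diagonal.
W = np.array([[ 0.0, 0.6, -0.4],
              [ 0.6, 0.0,  0.8],
              [-0.4, 0.8,  0.0]])
b = np.array([0.2, -0.1, 0.3])

def target_distribution(W, b):
    """Exact Boltzmann distribution p(z) ∝ exp(z·Wz/2 + b·z) over all 2^n states."""
    states = np.array(list(product([0, 1], repeat=len(b))))
    logp = 0.5 * np.einsum('si,ij,sj->s', states, W, states) + states @ b
    p = np.exp(logp)
    return p / p.sum()

def gibbs_sample(W, b, n_sweeps, rng):
    """Sequential Gibbs updates; returns the empirical state distribution."""
    n = len(b)
    z = rng.integers(0, 2, size=n)
    counts = np.zeros(2 ** n)
    weights = 2 ** np.arange(n)        # binary encoding of the state
    for _ in range(n_sweeps):
        for i in range(n):
            p_on = 1.0 / (1.0 + np.exp(-(W[i] @ z + b[i])))
            z[i] = rng.random() < p_on
        counts[z @ weights] += 1
    return counts / counts.sum()

p_target = target_distribution(W, b)
p_sampled = gibbs_sample(W, b, 20000, rng)
# deviation from the target distribution, measured as D_KL(target || sampled)
dkl = np.sum(p_target * np.log(p_target / np.maximum(p_sampled, 1e-12)))
```

In this idealized sketch each conditional update uses an independent random number; the poster's point is that when such updates instead share a finite set of noise sources (or are driven by a decorrelating recurrent network), the resulting correlations raise or lower exactly this kind of divergence.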

References

  1. Rolls ET, Deco G: The Noisy Brain. Oxford University Press; 2010.

  2. Hinton GE, Sejnowski TJ, Ackley DH: Boltzmann machines: constraint satisfaction networks that learn. Technical report, Carnegie-Mellon University; 1984.

  3. Buesing L, Bill J, Nessler B, Maass W: Neural dynamics as sampling: a model for stochastic computation in recurrent networks of spiking neurons. PLoS Comput Biol. 2011, 7: e1002211.

  4. Petrovici MA, Bill J, Bytschok I, Schemmel J, Meier K: Stochastic inference with deterministic spiking neurons. 2013, arXiv:1311.3211 [q-bio.NC].

  5. Schemmel J, Bruederle D, Gruebl A, Hock M, Meier K, Millner S: A wafer-scale neuromorphic hardware system for large-scale neural modeling. Proceedings of the 2010 International Symposium on Circuits and Systems (ISCAS), IEEE Press. 2010, 1947-1950.

  6. Bruederle D, Petrovici M, Vogginger B, Ehrlich M, Pfeil T, Millner S, Gruebl A, Wendt K, Mueller E, Schwartz MO, et al: A comprehensive workflow for general-purpose neural modeling with highly configurable neuromorphic hardware systems. Biological Cybernetics. 2011, 104: 263-296.

  7. Petrovici MA, Vogginger B, Mueller P, Breitwieser O, Lundqvist M, Muller L, Ehrlich M, Destexhe A, Lansner A, Schueffny R, et al: Characterization and compensation of network-level anomalies in mixed-signal neuromorphic modeling platforms. PLoS ONE. 2014, 9 (10): e108590.

  8. Renart A, de la Rocha J, Bartho P, Hollender L, Parga N, Reyes A, Harris KD: The asynchronous state in cortical circuits. Science. 2010, 327: 587-590.

  9. Tetzlaff T, Helias M, Einevoll G, Diesmann M: Decorrelation of neural-network activity by inhibitory feedback. PLoS Comput Biol. 2012, 8: e1002596.


Acknowledgements

Partially supported by the Helmholtz Association portfolio theme SMHB, the Jülich Aachen Research Alliance (JARA), EU Grant 269921 (BrainScaleS), The Austrian Science Fund FWF #I753-N23 (PNEUMA), The Manfred Stärk Foundation, and EU Grant 604102 (Human Brain Project, HBP).

Author information

Corresponding author

Correspondence to Jakob Jordan.

Rights and permissions

Open Access  This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made.

The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.

The Creative Commons Public Domain Dedication waiver (https://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Jordan, J., Tetzlaff, T., Petrovici, M. et al. Deterministic neural networks as sources of uncorrelated noise for probabilistic computations. BMC Neurosci 16 (Suppl 1), P62 (2015). https://doi.org/10.1186/1471-2202-16-S1-P62
