
This article is part of the supplement: Nineteenth Annual Computational Neuroscience Meeting: CNS*2010

Open Access Poster Presentation

Holistically convergent modular networks: a biological principle for recurrent network architecture

Matthew Cook1*, Luca Gugelmann2, Florian Jug2 and Christoph Krautz2

Author Affiliations

1 Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland

2 Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland


BMC Neuroscience 2010, 11(Suppl 1):P38  doi:10.1186/1471-2202-11-S1-P38

The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1471-2202/11/S1/P38


Published: 20 July 2010

© 2010 Cook et al; licensee BioMed Central Ltd.

Poster Presentation

The general principle of combining information processing modules in a feed-forward manner, sequentially processing the input until the desired output has been generated, is well understood and widely used. In contrast, biological neural circuits often consist of recurrently connected modules, with each module's units encoding a different aspect of the modeled world [1,3]. Engineering principles for constructing circuits in this biological pattern are not yet available.

We explore this biological approach to circuit synthesis by creating a network of interacting visual maps whose goal is to find a scene interpretation consistent with the input. We construct the network by specifying the modules and their relationships as shown in Figure 1. The network is not hierarchical, and the relationships are bidirectional, creating recurrent loops. The input we use is the time derivative (not the magnitude) of the light intensity at each pixel, as provided by an adaptive neuromorphic sensor [2].

Figure 1. (A) Network architecture. R = camera rotation (global 3D vector), F = optic flow (2D vector map), V = pixel positions projected onto sphere at focal point (constant 3D vector map), E = time derivative of light intensity (scalar map), G = spatial gradient of intensity (2D vector map), I = light intensity (scalar map). Each of these is a map of values covering the visual space, except for R, which is a single global value. Note that our sensory input is E, not I. (B) Example of I, G, E, and F while the camera is rotating around its viewing axis. The circular legend indicates directional colors in G and F.
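To make these module relationships concrete, the following Python sketch sets up the maps from Figure 1. The image size, focal length, and exact constraint forms here (G as the spatial gradient of I, a purely rotational flow field predicted from R and V, and the standard brightness-change relation E = -G . F) are our reading of the figure and the text below, not code from the authors.

    import numpy as np

    # Hypothetical setup of the maps in Figure 1. Shapes, the focal
    # length, and the constraint forms are illustrative assumptions.
    H, W = 64, 64
    focal = 50.0

    # V: constant 3D viewing direction per pixel, normalized onto the unit sphere.
    ys, xs = np.mgrid[0:H, 0:W].astype(float)
    rays = np.stack([xs - W / 2, ys - H / 2, np.full((H, W), focal)], axis=-1)
    V = rays / np.linalg.norm(rays, axis=-1, keepdims=True)

    def gradient_of(I):
        """G: spatial gradient of the intensity map I (2D vector map)."""
        gy, gx = np.gradient(I)
        return np.stack([gx, gy], axis=-1)

    def flow_from_rotation(R):
        """F: optic flow predicted from a global camera rotation R.
        Under pure rotation a viewing ray v moves with velocity R x v;
        we keep the two image-plane components as the 2D flow."""
        motion = np.cross(np.broadcast_to(R, V.shape), V)
        return motion[..., :2]

    def predicted_E(G, F):
        """E: temporal intensity derivative implied by G and F via the
        brightness-change constraint E = -G . F (assumed form)."""
        return -np.einsum('...k,...k->...', G, F)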

Since different consistency constraints are located in different parts of the network, information must propagate from the input to the far ends of the network and back repeatedly as the network converges to a globally consistent state, a process we refer to as holistic convergence. Because the constraint involving the input E leaves F and G underconstrained (a single scalar per pixel cannot pin down two vector-valued maps), R and I cannot be inferred from it directly. Only by combining local constraints from all parts of the network can a globally coherent interpretation be obtained.
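As a toy illustration of this convergence, continuing the sketch above, the relaxation below repeatedly reconciles R with the observed E through the intermediate maps G and F. The blending rate and the least-squares update are our assumptions; the authors' network presumably uses many small bidirectional nudges between neighboring modules rather than a closed-form solve.

    def relax_step(maps):
        """One round of local constraint updates (toy version, assumed)."""
        I, R, E_obs = maps['I'], maps['R'], maps['E_obs']
        G = gradient_of(I)  # local constraint: G = grad I
        # E = -G . F with F linear in R, so the best-fitting rotation
        # follows from a linear least-squares solve against E_obs.
        A = np.stack([predicted_E(G, flow_from_rotation(b))
                      for b in np.eye(3)], axis=-1)
        R_fit, *_ = np.linalg.lstsq(A.reshape(-1, 3), E_obs.ravel(), rcond=None)
        maps['R'] = 0.9 * R + 0.1 * R_fit  # slow, recurrent-style blending
        maps['F'] = flow_from_rotation(maps['R'])
        maps['G'] = G
        return maps

    # Camera rotating about its viewing axis, as in Figure 1B.
    true_R = np.array([0.0, 0.0, 1.0])
    I = np.random.rand(H, W)  # stand-in intensity image
    maps = {'I': I, 'R': np.zeros(3),
            'E_obs': predicted_E(gradient_of(I), flow_from_rotation(true_R))}
    for _ in range(50):
        maps = relax_step(maps)
    print(maps['R'])  # approaches true_R as the loop settles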

This work demonstrates how a biologically inspired recurrent architecture of modules and local relationships can be designed to interpret noisy or ambiguous data even when no clear feed-forward method exists for generating the desired interpretation. This provides a new paradigm for the design of information processing systems, based on principles found in biological systems.

Acknowledgements

This work was supported by ETH Research Grant ETH-23 08-1 and EU Project Grant FET-IP-216593.

References

  1. Felleman D, Van Essen DC: Distributed hierarchical processing in the primate cerebral cortex. Cereb Cortex 1991, 1:1-47.

  2. Lichtsteiner P, Posch C, Delbruck T: A 128 × 128 120 dB 30 mW asynchronous vision sensor that responds to relative intensity change. IEEE Int Solid-State Circuits Conf 2006, 2060-2069.

  3. Ringbauer S, Bayerl P, Neumann H: Neural mechanisms for mid-level optical flow pattern detection. Lect Notes Comput Sci 2007, 4669:281-290.