
This article is part of the supplement: Sixteenth Annual Computational Neuroscience Meeting: CNS*2007

Open Access Poster presentation

Analysis of coupled decision-making modules for multisensory integration

Marco Loh*, Ralph G Andrzejak and Gustavo Deco

Author affiliations

Department of Technology, Universitat Pompeu Fabra, Barcelona, Spain


Citation and License

BMC Neuroscience 2007, 8(Suppl 2):P19  doi:10.1186/1471-2202-8-S2-P19


Published: 6 July 2007

© 2007 Loh et al; licensee BioMed Central Ltd.

Poster presentation

We were interested in how two coupled decision-making modules behave. This question is relevant, for example, to multisensory integration, in which auditory and visual percepts have to be integrated into one common percept. We used a biophysically realistic neural model consisting of integrate-and-fire neurons with detailed synaptic channels and studied how the strength of the cross-connection between the two decision-making modules affects the performance of the model. Performance was assessed by how often the system correctly extracts the stimulus even though only a weak input is applied. We found that the optimal performance of the coupled modules is achieved at a certain cross-connection strength, which is independent of the amplitude of the stimulus input. This means that once an optimal cross-connection has been learned, it can be used for all types of stimulus inputs to achieve optimal performance. We also present the mechanism responsible for this improvement: inconsistent constellations in the two modules converge to the correct response.

We could also simulate the law of inverse effectiveness in our model. We related the strength of the input bias to the multisensory integration index and found an inverse relationship, as observed in experimental data [1].

We also investigated a three-module architecture, in which two primary sensory areas, such as the auditory and the visual one, are connected by a third, integrating module. This third module could correspond to higher processing areas such as the STS, which mediates between the two primary sensory areas. We show that the dynamics are similar to the two-module case, although the necessary coupling strength between the modules is increased, as the coupling is more indirect. We conclude that decision-making modules can be coupled to increase performance compared to a single decision-making module.
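The two-module setup can be illustrated with a minimal rate-model sketch. This is not the authors' integrate-and-fire model with detailed synaptic channels; it is a reduced toy version under assumed parameters (self-excitation, mutual inhibition, noise level, and the `coupling` cross-connection between corresponding choice populations are all illustrative), written only to show how performance under weak input could be measured as a function of coupling strength:

```python
import numpy as np

def simulate_trial(bias, coupling, rng, steps=300, dt=0.1):
    """One trial of two coupled two-choice decision modules.

    Each module is a reduced rate model with self-excitation and
    mutual inhibition; `coupling` connects corresponding choice
    populations across the two modules. A weak input `bias` favors
    choice 0 in both modules. All parameters are illustrative
    assumptions, not fitted to the spiking model in the abstract.
    """
    r = np.zeros((2, 2))               # r[module, choice]
    w_self, w_inh = 1.2, 1.0
    inputs = np.array([0.5 + bias, 0.5])
    for _ in range(steps):
        for m in range(2):
            other = 1 - m
            drive = (inputs
                     + w_self * r[m]       # self-excitation
                     - w_inh * r[m][::-1]  # mutual inhibition
                     + coupling * r[other])  # cross-module connection
            noise = rng.normal(0.0, 0.3, size=2)
            # bounded rate dynamics (rates stay in [0, 1])
            r[m] += dt * (-r[m] + np.tanh(np.maximum(drive + noise, 0.0)))
    # each module "decides" for the choice with the higher rate
    return [int(np.argmax(r[m])) for m in range(2)]

def accuracy(bias, coupling, trials=200, seed=0):
    """Fraction of trials in which BOTH modules pick the biased choice."""
    rng = np.random.default_rng(seed)
    correct = sum(simulate_trial(bias, coupling, rng) == [0, 0]
                  for _ in range(trials))
    return correct / trials
```

Sweeping `coupling` over a range for a fixed weak `bias` is one way to look, in this toy setting, for the intermediate optimum in cross-connection strength that the abstract reports; the actual optimum and its input-independence are claims about the full spiking model, not this sketch.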

Acknowledgements

The study has been supported by the Volkswagen Foundation.

References

  1. Ghazanfar AA, Maier JX, Hoffman KL, Logothetis NK: Multisensory integration of dynamic faces and voices in rhesus monkey auditory cortex. J Neurosci 2005, 25(20):5004-5012.