This article is part of the supplement: Twentieth Annual Computational Neuroscience Meeting: CNS*2011

Open Access Poster presentation

Motion-based predictive coding is sufficient to solve the aperture problem

Mina A Khoei*, Laurent U Perrinet and Guillaume S Masson

Author Affiliations

Institut de Neurosciences Cognitives de la Méditerranée, CNRS, Université de la Méditerranée, 13402 Marseille Cedex 20, France

BMC Neuroscience 2011, 12(Suppl 1):P279 doi:10.1186/1471-2202-12-S1-P279


The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1471-2202/12/S1/P279


Published: 18 July 2011

© 2011 Khoei et al; licensee BioMed Central Ltd.

This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Poster presentation

It is still unclear how information collected locally by low-level sensory neurons gives rise to a coherent global percept. This is well demonstrated by the aperture problem, in both the visual and the haptic senses. Experimental findings on its biological solution in area MT show that local motion measures are integrated over time, with global motion information emerging dynamically [1]. We develop a theory of spatio-temporal integration that implements motion-based predictive coding. This takes the form of an anisotropic, context-dependent diffusion of local information [2]. Here, we test this functional model for the aperture problem in the visual and haptic low-level sensory areas.
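
To make the local ambiguity concrete, here is a minimal sketch (a hypothetical illustration, not part of the original work; the Gaussian measurement noise sigma is an assumption): an edge seen through a small aperture only constrains the velocity component along the edge normal, so the likelihood over the full velocity is a ridge along the constraint line v·n = c, and very different global velocities receive identical local support.

```python
import numpy as np

def aperture_likelihood(v, n, c, sigma=0.1):
    """Likelihood of candidate velocity v given measured normal speed c along unit normal n."""
    err = v @ n - c          # signed distance to the velocity constraint line v . n = c
    return np.exp(-0.5 * (err / sigma) ** 2)

# A vertical line translating rightward at unit speed: edge normal n = (1, 0), normal speed c = 1.
n = np.array([1.0, 0.0])
c = 1.0

# The true velocity and a velocity sliding along the line get the same local support:
print(aperture_likelihood(np.array([1.0, 0.0]), n, c))  # ~1.0, true motion
print(aperture_likelihood(np.array([1.0, 0.7]), n, c))  # ~1.0, ambiguous alternative
```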

In our model, spatial and motion information is represented in a probabilistic framework. Information is pooled using a Markov chain formulation, merging the current state estimate with the measurement likelihood through a prior on motion transitions. This prior is adapted to the smooth trajectories observed in natural environments, so that the dynamical system favors temporally coherent features. In contrast to neural approximations [3], we implement this functional model with a particle filtering method. This generalizes previously used Kalman filtering approaches by allowing non-Gaussian and multimodal distributions to be represented.
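
A minimal particle-filter sketch of this formulation follows. It is a hypothetical illustration under stated assumptions, not the authors' implementation: the state (a position and a velocity per particle), the smooth-trajectory transition prior (a small diffusion on velocity), the aperture-constrained Gaussian likelihood, and the multinomial resampling step are all choices made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000                                 # number of particles
x = rng.uniform(-1, 1, size=(N, 2))      # particle positions
v = rng.normal(0, 1, size=(N, 2))        # particle velocities
w = np.ones(N) / N                       # importance weights

def predict(x, v, dt=1.0, sigma_v=0.05):
    """Smooth-trajectory prior: drift positions along velocities, slightly diffuse velocities."""
    v = v + sigma_v * rng.normal(size=v.shape)
    x = x + dt * v
    return x, v

def update(w, v, n, c, sigma=0.1):
    """Weight particles by the aperture-constrained likelihood of their velocity."""
    err = v @ n - c
    w = w * np.exp(-0.5 * (err / sigma) ** 2)
    return w / w.sum()

def resample(x, v, w):
    """Multinomial resampling to concentrate particles on likely hypotheses."""
    idx = rng.choice(len(w), size=len(w), p=w)
    return x[idx], v[idx], np.ones(len(w)) / len(w)

# Repeated observations of a vertical edge translating rightward (normal speed c = 1).
n, c = np.array([1.0, 0.0]), 1.0
for t in range(20):
    x, v = predict(x, v)
    w = update(w, v, n, c)
    x, v, w = resample(x, v, w)

# The normal component of velocity is recovered; the tangential component stays ambiguous.
print("mean velocity estimate:", v.mean(axis=0))
```

With only this local constraint, the velocity estimate remains spread along the constraint line; in the model described above, it is the anisotropic, context-dependent diffusion of information between locations that progressively disambiguates the global motion.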

We observe the emergence of mechanisms that reflect observations made at the psychophysical and behavioral levels. First, the dynamical system shows the emergence of the solution to the aperture problem and a dependence on line length [4]. Then, when presented with an object undergoing a regular translation, the dynamical system captures its motion independently of its shape and exhibits motion extrapolation. This shows that prediction is sufficient for the dynamical build-up of information from a local to a global scale. More generally, it may give insights into the role of spatio-temporal integration in the neural dynamics underlying the emergence of properties attributed to low-level sensory computations.

References

  1. Smith MA, Majaj N, Movshon JA: Dynamics of pattern motion computation. In Dynamics of Visual Motion Processing: Neuronal, Behavioral and Computational Approaches. Edited by Masson GS, Ilg UJ. Springer; 2010:55-72.

  2. Watamaniuk S, McKee S, Grzywacz N: Detecting a trajectory embedded in random-direction motion noise. Vision Research 1995, 35(1):65-77.

  3. Burgi PY, Yuille AL, Grzywacz N: Probabilistic motion estimation based on temporal coherence. Neural Computation 2000, 12(8):1839-1867.

  4. Castet E, Lorenceau J, Bonnet C: The inverse intensity effect is not lost with stimuli in apparent motion. Vision Research 1993, 33(12):1697-1708.