This article is part of the supplement: Eighteenth Annual Computational Neuroscience Meeting: CNS*2009

Poster presentation (Open Access)

Neural network model for sequence learning based on phase precession

Sean Byrnes1,2*, Anthony N Burkitt1,2, Hamish Meffin3 and David B Grayden1,2

Author Affiliations

1 The Bionic Ear Institute, Melbourne, Victoria, 3002, Australia

2 Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Victoria, 3010, Australia

3 NICTA, c/- Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Victoria, 3010, Australia

BMC Neuroscience 2009, 10(Suppl 1):P259  doi:10.1186/1471-2202-10-S1-P259


The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1471-2202/10/S1/P259


Published: 13 July 2009

© 2009 Byrnes et al; licensee BioMed Central Ltd.

Introduction

We have developed a network structure, inspired by experimental data on the neural correlates of navigation in the hippocampus [1], that is able to learn and recognize multiple overlapping sequences of symbols. The network relies on an analog of phase precession, the phenomenon in which place cells fire at progressively earlier phases with respect to a local field potential oscillation as the animal moves further into the place field. Following [1], we suppose phase precession serves to compress temporal sequences, ensuring both the appropriate timescale for synaptic plasticity and robustness to variation in symbol presentation rate. One model proposes that phase precession results from the interaction of rhythmic inhibition with excitatory input [2]. We investigate how different assumptions about the nature of the excitatory input and variation in other model parameters affect the robustness of phase precession and sequence learning.
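
The compression argument can be made concrete with rough numbers. The values below (an 8 Hz inhibitory oscillation, a few hundred milliseconds between symbol presentations, a handful of recently presented symbols still active) are illustrative assumptions for the sketch, not parameters reported here.

```python
# Rough arithmetic sketch of sequence compression by phase precession.
# All numbers are illustrative assumptions, not parameters from the abstract.

T_osc = 125.0      # period of the inhibitory oscillation in ms (8 Hz, theta-like)
T_symbol = 400.0   # assumed interval between successive symbol presentations (ms)
n_active = 5       # assumed number of recent symbols still receiving input

# Behavioral timescale: consecutive symbols are hundreds of ms apart, well
# outside a typical STDP window of a few tens of milliseconds.
print(f"real inter-symbol interval: {T_symbol:.0f} ms")

# Compressed timescale: within one inhibitory cycle the n_active recent symbols
# fire in order, so consecutive symbols are separated by roughly T_osc / n_active,
# which falls inside the STDP window regardless of the presentation rate.
print(f"compressed interval within a cycle: {T_osc / n_active:.0f} ms")
```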

Model

The network consists of pools of leaky integrate-and-fire neurons, with one pool for each symbol. All pools receive identical periodic inhibition. When a symbol occurs, it is represented by excitatory input to the corresponding pool; this input is initially weak, increases over time, and then ceases. The periodic inhibition combines with the increasing input strength to produce phase precession and sequence compression: during each cycle of the inhibition, the neurons corresponding to recent symbols fire in the correct order.
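
As a rough illustration of this mechanism, the following sketch simulates a single leaky integrate-and-fire neuron from one pool, driven by periodic inhibition and an excitatory input that ramps up while its symbol is active. The current-based formulation and all parameter values are assumptions made for the sketch, not those of the model.

```python
import numpy as np

# Sketch: one leaky integrate-and-fire neuron from a symbol pool, driven by
# periodic inhibition plus an excitatory input that ramps up while its symbol
# is active. All parameter values are illustrative assumptions.

dt = 0.1                       # time step (ms)
T = 1000.0                     # duration of one symbol presentation (ms)
tau_m = 20.0                   # membrane time constant (ms)
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0
f_osc = 8.0                    # frequency of the shared periodic inhibition (Hz)

t = np.arange(0.0, T, dt)
inhibition = 0.6 * (1.0 + np.sin(2.0 * np.pi * f_osc * t / 1000.0))
excitation = 0.5 + 1.5 * t / T          # initially weak, then increasing

v = v_rest
first_spike_phase = {}                  # cycle index -> phase of first spike
for i, ti in enumerate(t):
    drive = excitation[i] - inhibition[i]
    v += dt / tau_m * (-(v - v_rest) + drive)
    if v >= v_thresh:
        v = v_reset
        cycle = int(f_osc * ti / 1000.0)
        if cycle not in first_spike_phase:
            first_spike_phase[cycle] = f_osc * ti / 1000.0 - cycle

# The first spike of each inhibitory cycle occurs at a progressively earlier
# phase as the input strengthens: an analogue of phase precession.
for cycle, phase in sorted(first_spike_phase.items()):
    print(f"cycle {cycle:2d}: first spike at phase {phase:.2f}")
```

Printing the first spike of each inhibitory cycle shows the neuron firing at progressively earlier phases as the excitatory drive grows, which is the within-cycle ordering the model exploits for sequence compression.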

Sequence learning occurs because synapses between pools corresponding to consecutive symbols are strengthened by spike-timing dependent plasticity. Competitive heterosynaptic plasticity causes neurons to specialize in particular sequences: neurons that receive strengthened connections from the previously active pool fire with reduced latency, and recurrent inhibition then prevents the other neurons in the pool from firing at all.
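
The learning step can be sketched as follows. The exponential STDP window and the multiplicative weight normalization standing in for competitive heterosynaptic plasticity are assumed forms; the abstract does not specify the exact rules.

```python
import numpy as np

# Sketch of the learning step between two pools: pairwise STDP followed by a
# competitive heterosynaptic step. The exponential STDP window and the
# multiplicative normalization used for competition are assumptions.

rng = np.random.default_rng(0)

n_pre, n_post = 20, 20                     # neurons in the "previous" and "current" symbol pools
w = rng.uniform(0.0, 0.1, size=(n_post, n_pre))

A_plus, A_minus = 0.01, 0.012              # assumed potentiation/depression amplitudes
tau_plus, tau_minus = 20.0, 20.0           # assumed STDP time constants (ms)
w_total = 1.0                              # assumed fixed total incoming weight per neuron

def stdp(delta_t):
    """Weight change for a spike-time difference delta_t = t_post - t_pre (ms)."""
    if delta_t >= 0.0:
        return A_plus * np.exp(-delta_t / tau_plus)    # pre before post: potentiate
    return -A_minus * np.exp(delta_t / tau_minus)      # post before pre: depress

# One compressed presentation: within an oscillation cycle the "previous" pool
# fires ~15 ms before the "current" pool, so these synapses are potentiated.
t_pre = rng.normal(0.0, 2.0, n_pre)
t_post = rng.normal(15.0, 2.0, n_post)

for j in range(n_post):
    for i in range(n_pre):
        w[j, i] += stdp(t_post[j] - t_pre[i])

# Competitive heterosynaptic step: each neuron's summed incoming weight is held
# fixed, so synapses strengthened by STDP grow at the expense of the others,
# pushing neurons to specialize on particular sequences.
w = np.clip(w, 0.0, None)
w *= w_total / w.sum(axis=1, keepdims=True)

print("mean incoming weight per synapse:", w.mean())
```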

Results

The rate, regularity and synaptic time courses of the excitatory inputs are shown to affect the reliability and precision of phase precession. The ability of the network to learn and store multiple sequences is demonstrated, and its sensitivity to variability in phase precession is characterized; in particular, the ability to learn infrequent sequence branches is shown to depend strongly on phase variability. The robustness of network behavior to variation in other parameters is also described.

References

  1. Skaggs WE, McNaughton BL, Wilson MA, Barnes CA: Theta phase precession in hippocampal neuronal populations and the compression of temporal sequences. Hippocampus 1996, 6:149-172.

  2. Mehta MR, Lee AK, Wilson MA: Role of experience and oscillations in transforming a rate code into a temporal code. Nature 2002, 417:741-746.