We have developed a network structure, inspired by experimental data on the neural correlates of navigation in the hippocampus, that is able to learn and recognize multiple overlapping sequences of symbols. The network relies on an analog of phase precession, the phenomenon in which place cells fire at progressively earlier phases with respect to a local field potential oscillation as the animal moves further into the place field. Following previous work, we suppose phase precession serves to compress temporal sequences, ensuring both the appropriate timescale for synaptic plasticity and robustness to variation in symbol presentation rate. One model proposes that phase precession results from the interaction of rhythmic inhibition with excitatory input. We investigate how different assumptions about the nature of the excitatory input and variation in other model parameters affect the robustness of phase precession and sequence learning.
The network consists of pools of leaky integrate-and-fire neurons, with one such pool for each symbol. All pools receive identical periodic inhibition. When a symbol occurs, it is represented by excitatory input to the corresponding pool. The input is initially weak, grows in strength, and then ceases. The periodic inhibition combines with the increasing input strength to produce phase precession and sequence compression: during each cycle of the inhibition, neurons corresponding to recent symbols fire in the correct order.
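The precession mechanism can be illustrated with a minimal threshold-crossing sketch (not the paper's full integrate-and-fire model): a unit fires at the first moment in each inhibition cycle when its ramping excitation exceeds the oscillating inhibition, so the firing phase advances from cycle to cycle. All parameter values and the function name below are illustrative assumptions.

```python
import numpy as np

def spike_phases(n_cycles=5, period=125.0, dt=0.1,
                 exc_start=0.1, exc_slope=0.0012):
    """Toy phase-precession sketch (hypothetical parameters).

    Inhibition oscillates as 0.5*(1 + cos(theta)) over each cycle;
    excitation ramps up linearly in time. The unit 'fires' at the
    first instant per cycle when excitation exceeds inhibition, and
    we record the oscillation phase (in radians) of that instant.
    """
    phases = []
    t = 0.0
    for c in range(n_cycles):
        fired = False
        while t < (c + 1) * period:
            theta = 2 * np.pi * (t % period) / period
            inh = 0.5 * (1 + np.cos(theta))   # max at phase 0
            exc = exc_start + exc_slope * t   # slowly ramping drive
            if not fired and exc > inh:
                phases.append(theta)
                fired = True
            t += dt
    return phases

# As the excitatory drive grows, the crossing occurs earlier in each
# cycle, i.e. the spike phase decreases monotonically.
print(spike_phases())
```

Because the ramp is tied to elapsed time since symbol onset rather than to the inhibition cycle, the same mechanism compresses a sequence of recently active pools into a single cycle, in order of onset.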
Sequence learning occurs because synapses between pools corresponding to consecutive symbols are strengthened by spike-timing-dependent plasticity. Competitive heterosynaptic plasticity causes specialization of neurons to particular sequences. Neurons that receive strengthened connections from the previously active pool fire with reduced latency, and recurrent inhibition then prevents the other neurons of the pool from firing at all.
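The directionality of this learning can be sketched with a standard pair-based STDP rule (a generic textbook form, not necessarily the exact rule used in the model): because phase precession makes pool A fire a few milliseconds before pool B within each cycle, the A-to-B weight is potentiated while the reverse connection is depressed. Spike times and constants below are illustrative.

```python
import numpy as np

def stdp_dw(pre_spikes, post_spikes,
            a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight change, summed over all spike pairs.

    Pre-before-post pairs (dt > 0) potentiate; post-before-pre pairs
    (dt < 0) depress, each with an exponential window of width tau (ms).
    """
    dw = 0.0
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            dt = t_post - t_pre
            if dt > 0:
                dw += a_plus * np.exp(-dt / tau)
            elif dt < 0:
                dw -= a_minus * np.exp(dt / tau)
    return dw

# Hypothetical compressed spike times (ms): pool A leads pool B by
# ~10 ms in each of three inhibition cycles of period 125 ms.
a_times = [10.0, 135.0, 260.0]
b_times = [20.0, 145.0, 270.0]

print(stdp_dw(a_times, b_times))  # A->B: net potentiation (> 0)
print(stdp_dw(b_times, a_times))  # B->A: net depression   (< 0)
```

The ~10 ms lag produced by within-cycle compression falls well inside the plasticity window, which is the point of the compression: at the raw symbol presentation rate (one per cycle or slower), consecutive symbols would be too far apart in time for STDP to link them.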
The rate, regularity, and synaptic time courses of the excitatory inputs are shown to affect the reliability and precision of phase precession. The ability of the network to learn and store multiple sequences is demonstrated, and its sensitivity to variability in phase precession is characterized; in particular, the ability to learn infrequent sequence branches is shown to depend strongly on phase variability. Robustness of network behavior with respect to variation in other parameters is also described.