
This article is part of the supplement: Seventeenth Annual Computational Neuroscience Meeting: CNS*2008

Open Access Oral presentation

An online Hebbian learning rule that performs independent component analysis

Claudia Clopath1*, André Longtin2 and Wulfram Gerstner1

Author affiliations

1 LCN, EPFL, Lausanne, 1015, Switzerland

2 CND, University of Ottawa, Ottawa, Ontario, Canada


Citation and License

BMC Neuroscience 2008, 9(Suppl 1):O13  doi:10.1186/1471-2202-9-S1-O13


The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1471-2202/9/S1/O13


Published: 11 July 2008

© 2008 Clopath et al; licensee BioMed Central Ltd.

Oral presentation

The so-called cocktail party problem refers to a situation in which several sound sources are active simultaneously, e.g. several people talking at the same time. The goal is to recover the original sound sources from measurements of the mixed signals. A standard method for solving the cocktail party problem is independent component analysis (ICA), which can be performed by a class of powerful algorithms. However, classical algorithms based on higher moments of the signal distribution [1] do not consider temporal correlations: data points from different time slices could be shuffled without changing the result. Time order matters, however, because most natural signal sources have intrinsic temporal correlations that could be exploited. Several algorithms have therefore been developed to take these temporal correlations into account, e.g. algorithms based on delayed correlations [2,3], possibly combined with higher-order statistics [4], on innovation processes [5], or on complexity pursuit [6]. However, these methods are rather algorithmic, and most are difficult to interpret biologically: for example, they are not online, are not local, or require preprocessing of the data.
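
As a point of comparison, the batch delayed-correlation approach of Molgedey and Schuster [3] can be summarized in a few lines. The sketch below (NumPy; the lag tau, the smoothing windows, and all variable names are illustrative choices, not taken from the cited papers) whitens the mixtures and then diagonalizes a symmetrized time-lagged covariance; it illustrates the batch approach only, not the online rule presented here.

    import numpy as np

    rng = np.random.default_rng(0)
    T, tau = 20000, 1

    # Two sources with different temporal correlations (different smoothing
    # windows), mixed linearly: x = C s.
    s = np.vstack([np.convolve(rng.standard_normal(T), np.ones(5) / 5, mode='same'),
                   np.convolve(rng.standard_normal(T), np.ones(50) / 50, mode='same')])
    s -= s.mean(axis=1, keepdims=True)
    C = rng.standard_normal((2, 2))
    x = C @ s

    # Whiten the mixtures.
    d, E = np.linalg.eigh(x @ x.T / T)
    z = E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ x

    # Diagonalizing the symmetrized lagged covariance gives the unmixing rotation.
    C_tau = z[:, :-tau] @ z[:, tau:].T / (T - tau)
    _, U = np.linalg.eigh(0.5 * (C_tau + C_tau.T))
    y = U.T @ z  # recovers the sources up to permutation and scaling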

Biological learning algorithms are usually implemented as online Hebbian learning rules that change synaptic efficacy based on the correlations between pre- and postsynaptic neurons. A Hebbian learning rule such as Oja's rule [7], combined with a linear neuron model, has been shown to perform principal component analysis (PCA). Simply replacing the linear neuron with a nonlinear one, still combined with Oja's learning rule, gives access to higher moments of the distributions and thus yields ICA, provided the signals have been whitened in an earlier preprocessing stage [1]. Here, we are interested instead in exploiting the correlations of the signals at different time delays, i.e. a generalization of the theory of Molgedey and Schuster [3]. We will show that a linear neuron model combined with a Hebbian learning rule based on the joint firing rates of the pre- and postsynaptic neurons at different time delays performs ICA by exploiting the temporal correlations of the presynaptic inputs (Figure 1).
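
The abstract does not spell out the exact update, so the following is only one plausible instantiation under stated assumptions, not the authors' rule: a single linear neuron trained on whitened inputs with an Oja-like update [7] in which the Hebbian term pairs the postsynaptic rate with the presynaptic rates one lag earlier. All names (whiten, online_delayed_hebb, eta, tau) are illustrative.

    import numpy as np

    def whiten(x):
        # Zero-mean, decorrelated inputs (assumed to come from an earlier
        # processing stage).
        x = x - x.mean(axis=1, keepdims=True)
        d, E = np.linalg.eigh(x @ x.T / x.shape[1])
        return E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ x

    def online_delayed_hebb(z, tau=1, eta=1e-3, seed=0):
        # Single linear neuron y(t) = w . z(t); the update pairs the postsynaptic
        # rate y(t) with the delayed presynaptic rates z(t - tau). The subtractive
        # term bounds |w|, as in Oja's rule [7].
        rng = np.random.default_rng(seed)
        w = rng.standard_normal(z.shape[0])
        w /= np.linalg.norm(w)
        for t in range(tau, z.shape[1]):
            y_now, y_del = w @ z[:, t], w @ z[:, t - tau]
            w += eta * (y_now * z[:, t - tau] - y_now * y_del * w)
        return w

    # Usage sketch: z = whiten(C @ s); w = online_delayed_hebb(z); y = w @ z
    # y then approximates one source up to sign and scaling; further components
    # would require, e.g., deflation or several output neurons.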

Figure 1. The sources s are mixed with a matrix C, x = Cs; x are the presynaptic signals. Using a linear neuron y = Wx, the weights W are updated following the Hebbian rule, so that the postsynaptic signals y recover the sources s.

References

  1. Hyvärinen A, Karhunen J, Oja E: Independent Component Analysis. Wiley-Interscience; 2001.

  2. Ziehe A, Müller K: TDSEP – an efficient algorithm for blind separation using time structure. Proc Int Conf on Art Neur Net 1998.

  3. Molgedey L, Schuster H: Separation of a mixture of independent signals using time delayed correlations. Phys Rev Lett 1994, 72:3634-3637.

  4. Müller K, Philips P, Ziehe A: JADE TD: Combining higher-order statistics and temporal information for blind source separation (with noise). Proc Int Workshop on ICA 1999.

  5. Hyvärinen A: Independent component analysis for time-dependent stochastic processes. Proc Int Conf on Art Neur Net 1998.

  6. Hyvärinen A: Complexity pursuit: Separating interesting components from time-series. Neural Comput 2001, 13:883-898.

  7. Oja E: A simplified neuron model as principal component analyzer. J Math Biol 1982, 15:267-273.