
This article is part of the supplement: Twenty First Annual Computational Neuroscience Meeting: CNS*2012

Open Access Oral presentation

A neural network based holistic model of ant route navigation

Bart Baddeley1, Paul Graham2, Philip Husbands1 and Andrew Philippides1*

Author Affiliations

1 Centre for Computational Neuroscience and Robotics, Department of Informatics, University of Sussex, Brighton, BN1 9QG, UK

2 Centre for Computational Neuroscience and Robotics, School of Life Sciences, University of Sussex, Brighton, BN1 9QJ, UK


BMC Neuroscience 2012, 13(Suppl 1):O1  doi:10.1186/1471-2202-13-S1-O1


The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1471-2202/13/S1/O1


Published: 16 July 2012

© 2012 Baddeley et al; licensee BioMed Central Ltd.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Oral presentation

The impressive ability of social insects to learn long foraging routes guided by visual information [1] provides proof that robust spatial behaviour can be produced with limited neural resources [2,3]. As such, social insects have become an important model system for understanding the minimal cognitive requirements for navigation [1]. This goal is shared by biomimetic engineers and those studying animal cognition through a bottom-up approach to natural intelligence [4]. Models of visual navigation that have successfully replicated place homing are dominated by snapshot-type models, in which a single view of the world memorized at the goal location is compared with the current view in order to drive a search for the goal [5] (for a review, see [6]). However, snapshot approaches only allow navigation in the immediate vicinity of the goal and do not achieve robust route navigation over longer distances [7].

Here we present a parsimonious model of visually guided route learning that addresses this issue [8]. We test the proposed route navigation strategy in simulation by learning a series of routes through visually cluttered environments consisting of objects that are distinguishable only as silhouettes against the sky. Our navigation algorithm consists of two phases. First, the ant traverses the route using a combination of path integration and obstacle avoidance, during which it experiences the views used to learn the route. Subsequently, the ant navigates by visually scanning the world – a behaviour observed in ants in the field – and moving in the direction deemed most familiar. As a proof of concept, we first determine view familiarity by exhaustive comparison with the set of views experienced during training. In subsequent experiments we train an artificial neural network to perform familiarity discrimination on the training views using the InfoMax algorithm [9].
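The scan-and-go strategy described above can be sketched as follows. This is a minimal illustration of the exhaustive-comparison ("perfect memory") variant only, not the authors' implementation: the `render_view` callback, the set of scan angles, and the sum-of-squared-differences image metric are all illustrative assumptions.

```python
import math

def view_difference(view_a, view_b):
    # Sum of squared pixel differences between two equal-length views
    # (an assumed image-comparison metric, for illustration only).
    return sum((a - b) ** 2 for a, b in zip(view_a, view_b))

def familiarity(current_view, training_views):
    # Perfect-memory familiarity: the (negated) smallest difference between
    # the current view and any view stored during the training traversal,
    # so that larger values mean "more familiar".
    return -min(view_difference(current_view, v) for v in training_views)

def scan_and_choose(position, heading, training_views, render_view, scan_angles):
    # Visually scan candidate directions (as ants are observed to do in the
    # field) and return the heading whose rendered view is most familiar.
    # render_view(position, heading) -> list of pixel values (hypothetical).
    best_heading, best_fam = heading, -math.inf
    for offset in scan_angles:
        candidate = heading + offset
        fam = familiarity(render_view(position, candidate), training_views)
        if fam > best_fam:
            best_heading, best_fam = candidate, fam
    return best_heading
```

Note that the agent needs no odometric or compass information here: each step depends only on the visual familiarity of candidate directions, which is the point of the holistic-route argument. The InfoMax variant would replace `familiarity` with the output of a trained familiarity-discrimination network, removing the need to store and exhaustively search the raw training views.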

By exploiting the interaction of sensorimotor constraints and observed innate behaviours, we show that it is possible to produce robust behaviour using a learnt holistic representation of a route. Furthermore, we show that the model captures the known properties of route navigation in desert ants, including the ability to learn a route after a single training run and the ability to learn multiple idiosyncratic routes to a single goal. Importantly, navigation is independent of odometric or compass information, the model does not specify when or what to learn, and it does not separate routes into sequences of waypoints, thus providing proof of concept that route navigation can be achieved without these elements. The algorithm also exhibits both place search and route navigation with the same mechanism. As such, we believe the model represents the only detailed and complete model of insect route guidance to date.

References

  1. Wehner R: The architecture of the desert ant’s navigational toolkit (Hymenoptera: Formicidae). Myrmecol News 2009, 12:85-96.

  2. Wehner R: Desert ant navigation: how miniature brains solve complex tasks. Karl von Frisch lecture. J Comp Physiol A 2003, 189:579-588.

  3. Chittka L, Skorupski P: Information processing in miniature brains. Proc R Soc Lond B 2011, 278:885-888.

  4. Shettleworth SJ: Cognition, Evolution, and Behavior. 2nd edition. New York: Oxford University Press; 2010.

  5. Cartwright BA, Collett TS: Landmark learning in bees. J Comp Physiol A 1983, 151:521-543.

  6. Möller R, Vardy A: Local visual homing by matched-filter descent in image distances. Biol Cybern 2006, 95:413-430.

  7. Smith L, Philippides A, Graham P, Baddeley B, Husbands P: Linked local navigation for visual route guidance. Adapt Behav 2007, 15:257-271.

  8. Baddeley B, Graham P, Husbands P, Philippides A: A model of ant route navigation driven by scene familiarity. PLoS Comput Biol 2012, 8(1):e1002336.

  9. Lulham A, Bogacz R, Vogt S, Brown MW: An infomax algorithm can perform both familiarity discrimination and feature extraction in a single network. Neural Comput 2011, 23:909-926.