Familiarity-taxis: a bilateral approach to view-based snapshot navigation
Many insects use view-based navigation, or snapshot matching, to return to familiar locations or to navigate routes. This relies on egocentric memories being matched to current views of the world. Previous snapshot navigation algorithms have used full panoramic vision to compare memorised images with query images and establish a measure of familiarity, which allows recovery of the original heading direction from when the snapshot was taken. Many aspects of insect sensory systems are lateralised, with steering derived from the comparison of left and right signals, as in a classic Braitenberg vehicle. Here, we investigate whether view-based route navigation can be implemented using bilateral visual familiarity comparisons. We found that the difference in familiarity between estimates from the left and right fields of view can be used as a steering signal to recover the original heading direction. This finding holds across a wide range of fields of view and visual resolutions. In insects, steering computations are implemented in a brain region called the Lateral Accessory Lobe (LAL), which is closely linked to the Central Complex. In a simple simulation, we use a spiking neural network (SNN) model of the LAL to provide an existence proof of how bilateral visual familiarity could drive a search for a visually defined goal.
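The bilateral steering rule described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: familiarity is approximated here as the negative of the smallest root-mean-square pixel difference to a bank of stored snapshots, the panorama is a 1-D ring of brightness values, and the sign convention (positive output meaning "turn right") is an assumption.

```python
import numpy as np

def familiarity(view, memory_bank):
    """Familiarity as the negative of the smallest RMS pixel difference
    between the current view and any stored snapshot (higher = more familiar).
    This image-difference measure is a stand-in; the paper's metric may differ."""
    diffs = [np.sqrt(np.mean((view - m) ** 2)) for m in memory_bank]
    return -min(diffs)

def bilateral_steering(panorama, memories_left, memories_right, fov_deg=90):
    """Hypothetical Braitenberg-style rule: steer toward the more familiar side.

    panorama:        1-D array of brightness values over 360 degrees.
    memories_left/right: snapshot banks for the left and right eyes.
    fov_deg:         width of each eye's field of view, taken either
                     side of the body midline (an assumed geometry).
    """
    n = panorama.size
    half = int(n * fov_deg / 360)
    mid = n // 2
    left_view = panorama[mid - half:mid]
    right_view = panorama[mid:mid + half]
    f_left = familiarity(left_view, memories_left)
    f_right = familiarity(right_view, memories_right)
    # Difference of bilateral familiarities is the steering signal:
    # positive -> turn right, negative -> turn left (sign convention assumed).
    return f_right - f_left
```

Turning with the sign of this signal rotates the agent until the two familiarity estimates balance, which recovers the heading from which the snapshots were stored, in the spirit of the Braitenberg comparison the abstract describes.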
Funding
ActiveAI - active learning and selective attention for robust, transparent and efficient AI : EPSRC-ENGINEERING & PHYSICAL SCIENCES RESEARCH COUNCIL
Brains on Board: Neuromorphic Control of Flying Robots : EPSRC-ENGINEERING & PHYSICAL SCIENCES RESEARCH COUNCIL | EP/P006094/1
History
Publication status
- Published
File Version
- Published version
Journal
- Adaptive Behavior
ISSN
- 1059-7123
Publisher
- SAGE Publications
Publisher URL
External DOI
Department affiliated with
- Informatics Publications
Institution
- University of Sussex
Full text available
- Yes
Peer reviewed?
- Yes