Linked local navigation for visual route guidance
journal contribution
posted on 2023-06-07, 22:02 authored by Lincoln Smith, Andy Philippides, Paul Graham, Bart Baddeley, Phil Husbands

Insects are able to navigate reliably between food and nest using only visual information. This behavior has inspired many models of visual landmark guidance, some of which have been tested on autonomous robots. The majority of these models work by comparing the agent's current view with a view of the world stored when the agent was at the goal. The region from which agents can successfully reach home is therefore limited to the goal's visual locale, that is, the area around the goal where the visual scene is not radically different to the goal position. Ants are known to navigate over large distances using visually guided routes consisting of a series of visual memories. Taking inspiration from such route navigation, we propose a framework for linking together local navigation methods. We implement this framework on a robotic platform and test it in a series of environments in which local navigation methods fail. Finally, we show that the framework is robust to environments of varying complexity.
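The record does not include the framework's implementation, but the idea described in the abstract, chaining view-based local homing methods along a route of stored views, can be sketched roughly as below. This is a minimal illustrative sketch, not the authors' method: the `robot` interface (`capture_view`, `step_towards_lower_difference`), the RMS image-difference measure, and the success threshold are all assumptions made for the example.

```python
# Illustrative sketch only: linking local visual homing steps along a route
# of stored snapshot views. All names and the homing rule are assumptions.
import numpy as np

def view_difference(current_view: np.ndarray, snapshot: np.ndarray) -> float:
    """Root-mean-square pixel difference between two panoramic views."""
    diff = current_view.astype(float) - snapshot.astype(float)
    return float(np.sqrt(np.mean(diff ** 2)))

def home_to_snapshot(robot, snapshot, threshold=0.05, max_steps=200):
    """Run one local homing method: move so that the current view approaches
    the stored snapshot; succeed once the difference falls below threshold."""
    for _ in range(max_steps):
        view = robot.capture_view()
        if view_difference(view, snapshot) < threshold:
            return True  # reached the visual locale goal for this snapshot
        # Hypothetical local move that reduces the image difference,
        # e.g. by trial steps or a gradient estimate.
        robot.step_towards_lower_difference(snapshot)
    return False

def follow_route(robot, route_snapshots, threshold=0.05):
    """Linked local navigation: treat each stored view as a waypoint and hand
    over to the next local homing procedure when the current one succeeds."""
    for snapshot in route_snapshots:
        if not home_to_snapshot(robot, snapshot, threshold):
            return False  # drifted outside this waypoint's visual locale
    return True  # the final snapshot corresponds to the goal view
```

The point mirrored here is that each stored view only needs to be reachable from within its own visual locale; chaining such local methods extends guidance beyond the locale of any single goal view.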
History
Publication status
- Published
Journal
Adaptive Behavior
ISSN
1059-7123
Publisher
SAGE Publications
Issue
3
Volume
15
Page range
257-271
Department affiliated with
- Informatics Publications
Full text available
- No
Peer reviewed?
- Yes