Dataset for research paper: Exploring the robustness of insect-inspired visual navigation for flying robots

Data for a paper published at ALIFE 2020: The 2020 Conference on Artificial Life, July 14, 2020.

This dataset comprises the image data used for the ALife 2020 paper, recorded from both the robot gantry system and a UAV.

Firstly, there is the 3D database of panoramic images recorded with the University of Sussex's robot gantry system. Plastic foliage and flowers were present in the gantry arena to provide naturalistic visual stimuli. Both the original raw images and the processed (i.e. unwrapped) images are included. The coordinates at which each image was recorded are listed in the included CSV files and metadata for the databases are included in YAML and .mat (MATLAB) formats.
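As a minimal sketch of how the coordinate listing might be consumed, the snippet below parses a CSV that pairs each image with the gantry position at which it was recorded. The column names (`filename`, `x_mm`, `y_mm`, `z_mm`) and the example rows are assumptions for illustration only; check the CSV files and the YAML/.mat metadata in the archive for the actual headers and units.

```python
import csv
import io

# Hypothetical example of the coordinate listing described above. The header
# and values are illustrative assumptions, not the dataset's real contents.
EXAMPLE_CSV = """filename,x_mm,y_mm,z_mm
image_0000.png,0,0,100
image_0001.png,50,0,100
image_0002.png,50,50,200
"""

def load_coordinates(csv_file):
    """Read (filename, x, y, z) records from an open CSV file object."""
    reader = csv.DictReader(csv_file)
    return [
        (row["filename"],
         float(row["x_mm"]), float(row["y_mm"]), float(row["z_mm"]))
        for row in reader
    ]

# For the real dataset, open(path) would replace the in-memory example.
coords = load_coordinates(io.StringIO(EXAMPLE_CSV))
print(len(coords))    # number of image records parsed
print(coords[0][0])   # filename of the first record
```

The same records could then be used to index into the raw or unwrapped image directories, e.g. grouping records by their z coordinate to compare views stored at different heights.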

The second dataset comprises the image data used in the UAV section of the paper, both in raw form and after processing. Further detail on this dataset is included in the ZIP file.

Abstract:
Having previously developed and tested insect-inspired visual navigation algorithms for ground-based agents, we here investigate their robustness when applied to agents moving in three dimensions, to assess whether they are applicable to both flying insects and robots, focusing on the impact and potential utility of changes in height. We first demonstrate that a robot implementing a route navigation algorithm can successfully navigate a route through an indoor environment at a variety of heights, even using images saved at different heights. We show that, in our environments, the efficacy of route navigation increases with height, and also that images learnt at a greater height transfer information better when used to navigate while flying lower than images learnt low transfer when flying higher. This suggests that there may be adaptive value in storing and using views from increased height. To assess the limits of this result, we show that a ground-based robot can recover the correct heading using goal images stored from the perspective of a quadcopter. Through the robustness of this bio-inspired algorithm, we thus demonstrate the benefits of the ALife approach.