Efficient outdoor navigation remains a challenge for autonomous robots, yet bees excel at robust long-range navigation with minimal computational resources. To do so, they scaffold learning through innate behaviours such as survey flights: loops centred on the nest, performed before foraging, that explore the surrounding environment. While the 2D positions of these flights have been tracked by radar, it has not been tested how well such flights can support subsequent long-range visual homing, nor whether their 3D structure (which radar does not capture) affects homing performance. Using a 6 km² 3D LiDAR scan of the Rothamsted Research Centre, where bumblebee flights were tracked in radar experiments, we recreate the trajectories of bumblebee exploration and foraging flights. We then render panoramic views at the visited coordinates and use them to test the efficacy of visual homing over large distances and at flying altitudes ranging from 2 to 32 m above the ground. We find that our model can predict the direction of the target from distances of up to 300 m. Additionally, homing improves at higher altitudes, but information transfers poorly between flying heights.
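The abstract does not specify the homing model used. A common baseline in insect-inspired visual homing is the rotational image difference function (RIDF), which compares a current panoramic view against a stored home view at every azimuthal rotation and takes the best-matching rotation as the heading estimate. The sketch below is a minimal illustration of that technique, not the authors' implementation; the function names, grayscale 2D panorama format, and RMS error metric are all assumptions.

```python
import numpy as np


def ridf(current: np.ndarray, stored: np.ndarray) -> np.ndarray:
    """Rotational image difference function (illustrative baseline).

    Both inputs are grayscale panoramas of shape (height, width), where
    the width axis spans 360 degrees of azimuth. Returns the RMS pixel
    difference between the stored view and the current view rotated by
    every possible azimuthal shift.
    """
    width = current.shape[1]
    diffs = np.empty(width)
    for shift in range(width):
        # Rotating the panorama along the width axis simulates the
        # agent turning on the spot.
        rotated = np.roll(current, shift, axis=1)
        diffs[shift] = np.sqrt(np.mean((rotated - stored) ** 2))
    return diffs


def homing_heading(current: np.ndarray, stored: np.ndarray) -> float:
    """Heading (degrees) at which the current view best matches the
    stored home view, i.e. the argmin of the RIDF."""
    diffs = ridf(current.astype(float), stored.astype(float))
    best_shift = int(np.argmin(diffs))
    return 360.0 * best_shift / current.shape[1]
```

Under this kind of scheme, homing range can be scored by rendering panoramas at increasing distances from the nest and checking whether the recovered heading still points toward it.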
Funding
ActiveAI: active learning and selective attention for robust, transparent and efficient AI. Engineering and Physical Sciences Research Council (EPSRC), EP/S030964/1.
Unlocking spiking neural networks for machine learning research. Engineering and Physical Sciences Research Council (EPSRC), EP/V052241/1.
Emergent embodied cognition in shallow, biological and artificial, neural networks. Biotechnology and Biological Sciences Research Council (BBSRC), BB/X01343X/1.