Low-latency sensory processing is a key requirement of collision avoidance systems for automated driving. Event-based cameras have been proposed and investigated as a new type of sensor enabling faster and more efficient collision detection. In this study, we investigate an insect-vision-inspired network that detects looming objects in traffic situations from event-based camera data. As such, it represents a lightweight alternative to the large neural networks applied in vision-based driving assistance systems. Using simulated driving accident scenarios, we find that our system can reliably detect colliding vehicles up to 1.3 s before the collision. Furthermore, we demonstrate the effectiveness of nonlinear radial motion opponency filtering in addressing the challenges of optical flow-based looming detection.
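The core idea of radial motion opponency can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes a dense optical flow field has already been estimated (e.g., from accumulated events), and the function name and parameters are illustrative. The signed radial component of the flow relative to an expansion center is half-wave rectified into outward (expansion) and inward (contraction) channels, and the opponent difference yields a looming signal that responds to approach but cancels for uniform translation.

```python
import numpy as np

def radial_opponency(flow, center):
    """Toy nonlinear radial motion opponency (illustrative, not the paper's model).

    flow   : (H, W, 2) array of optical flow vectors (u, v) per pixel.
    center : (cx, cy) assumed focus of expansion.
    Returns a scalar looming signal: rectified outward minus inward radial flow.
    """
    h, w, _ = flow.shape
    ys, xs = np.mgrid[0:h, 0:w]
    rx = xs - center[0]          # radial direction, x component
    ry = ys - center[1]          # radial direction, y component
    norm = np.hypot(rx, ry) + 1e-9
    # Signed projection of the flow onto the radial unit vector.
    radial = (flow[..., 0] * rx + flow[..., 1] * ry) / norm
    outward = np.maximum(radial, 0.0)   # expansion channel (approaching object)
    inward = np.maximum(-radial, 0.0)   # contraction channel (receding object)
    return outward.sum() - inward.sum()
```

For a purely expanding flow field the signal is strongly positive, while for a uniform translation the outward and inward components cancel, which is why the opponency stage helps reject ego-motion-like flow in looming detection.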
Funding
ActiveAI - active learning and selective attention for robust, transparent and efficient AI. EPSRC (Engineering and Physical Sciences Research Council), grant EP/S030964/1.
Unlocking spiking neural networks for machine learning research. EPSRC (Engineering and Physical Sciences Research Council), grant EP/V052241/1.