Dataset for paper accepted in IOP Neuromorphic Computing and Engineering, March 2022.

Dataset contains trained weights from TensorFlow 2.4.0 for the following models:

- vgg16_imagenet_tf_weights.h5 - VGG-16 model trained on the ImageNet ILSVRC dataset
- vgg16_tf_weights.h5 - VGG-16 model trained on the CIFAR-10 dataset
- resnet20_cifar10_tf_weights.h5 - ResNet-20 model trained on the CIFAR-10 dataset
- resnet34_imagenet_tf_weights.h5 - ResNet-34 model trained on the ImageNet ILSVRC dataset

Abstract

"In this paper we present mlGeNN - a Python library for the conversion of artificial neural networks (ANNs) specified in Keras to spiking neural networks (SNNs). SNNs are simulated using GeNN with extensions to efficiently support convolutional connectivity and batching. We evaluate converted SNNs on CIFAR-10 and ImageNet classification tasks and compare the performance to both the original ANNs and other SNN simulators. We find that performing inference using a VGG-16 model, trained on the CIFAR-10 dataset, is 2.5x faster than BindsNet and, when using a ResNet-20 model trained on CIFAR-10 with FewSpike ANN-to-SNN conversion, mlGeNN is only a little over 2x slower than TensorFlow."

Funding

Brains on Board, grant number EP/P006094/1
ActiveAI, grant number EP/S030964/1
Unlocking spiking neural networks for machine learning research, grant number EP/V052241/1
European Union's Horizon 2020 research and innovation programme under Grant Agreement 945539
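
The files above are Keras weight checkpoints (saved with `save_weights`, not full saved models), so a model with an identical layer structure must be built before the parameters can be restored with `load_weights`. A minimal sketch of this workflow, using a small illustrative network rather than the paper's exact VGG/ResNet architectures:

```python
import numpy as np
import tensorflow as tf

# Illustrative stand-in architecture; to use the dataset's files, the
# rebuilt model must match the paper's VGG-16 / ResNet layer structure.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(32, 32, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

# Weights-only .h5 checkpoint, the same format as the dataset files.
model.save_weights("demo_tf_weights.h5")

# A structurally identical model can restore the parameters, e.g.
#   model.load_weights("vgg16_tf_weights.h5")
clone = tf.keras.models.clone_model(model)
clone.build((None, 32, 32, 3))
clone.load_weights("demo_tf_weights.h5")

# The restored parameters match the originals exactly.
assert all(np.array_equal(a, b)
           for a, b in zip(model.get_weights(), clone.get_weights()))
```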