Imperial College London

Dr Dan Goodman

Faculty of Engineering, Department of Electrical and Electronic Engineering

Senior Lecturer

Contact

+44 (0)20 7594 6264
d.goodman
Website

Location

1001, Electrical Engineering, South Kensington Campus

Publications

Citation

BibTeX format

@article{Perez-Nieves:2021,
author = {Perez-Nieves, N and Goodman, DFM},
journal = {Advances in Neural Information Processing Systems},
pages = {11795--11808},
title = {Sparse Spiking Gradient Descent},
volume = {34},
year = {2021}
}

RIS format (EndNote, RefMan)

TY  - JOUR
AB - There is an increasing interest in emulating Spiking Neural Networks (SNNs) on neuromorphic computing devices due to their low energy consumption. Recent advances have allowed training SNNs to a point where they start to compete with traditional Artificial Neural Networks (ANNs) in terms of accuracy, while at the same time being energy efficient when run on neuromorphic hardware. However, the process of training SNNs is still based on dense tensor operations originally developed for ANNs, which do not leverage the spatiotemporally sparse nature of SNNs. We present here the first sparse SNN backpropagation algorithm, which achieves the same or better accuracy as current state-of-the-art methods while being significantly faster and more memory efficient. We show the effectiveness of our method on real datasets of varying complexity (Fashion-MNIST, Neuromorphic-MNIST and Spiking Heidelberg Digits), achieving a speedup in the backward pass of up to 150x while being 85% more memory efficient, without losing accuracy.
AU - Perez-Nieves,N
AU - Goodman,DFM
EP - 11808
PY - 2021///
SN - 1049-5258
SP - 11795
TI - Sparse Spiking Gradient Descent
T2 - Advances in Neural Information Processing Systems
VL - 34
ER -
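
The abstract above turns on one idea: during the backward pass, gradients are propagated only through the neurons that were active in the forward pass, rather than through dense tensors. What follows is a minimal sketch of that idea in PyTorch, assuming a triangular surrogate gradient; it is not the authors' released implementation, and the SparseSpike class and the threshold/band parameters are illustrative assumptions.

import torch

class SparseSpike(torch.autograd.Function):
    """Heaviside spike whose surrogate gradient is computed only at active neurons."""

    @staticmethod
    def forward(ctx, v, threshold=1.0, band=0.5):
        # Forward pass: emit a spike wherever the membrane potential
        # crosses the threshold.
        spikes = (v >= threshold).float()
        # Only neurons within `band` of the threshold get a nonzero
        # surrogate gradient; remember which ones they are.
        active = (v - threshold).abs() < band
        ctx.save_for_backward(v, active)
        ctx.threshold = threshold
        ctx.band = band
        return spikes

    @staticmethod
    def backward(ctx, grad_output):
        v, active = ctx.saved_tensors
        # A dense implementation would evaluate the surrogate everywhere.
        # Here it is evaluated only at the (typically few) active indices,
        # and every other gradient entry stays exactly zero.
        grad_v = torch.zeros_like(v)
        idx = active.nonzero(as_tuple=True)
        surrogate = 1.0 - (v[idx] - ctx.threshold).abs() / ctx.band  # triangular
        grad_v[idx] = grad_output[idx] * surrogate
        return grad_v, None, None

# Usage: membrane potentials for four neurons at one time step.
v = torch.tensor([0.2, 0.9, 1.3, 2.5], requires_grad=True)
spikes = SparseSpike.apply(v)
spikes.sum().backward()
print(v.grad)  # nonzero only where |v - threshold| < band

The speed and memory figures quoted in the abstract come from carrying this restriction through the full backward pass over the batch and time dimensions on GPU; the sketch above only shows the masking logic at a single step.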