TY - GEN
T1 - Approximating back-propagation for a biologically plausible local learning rule in spiking neural networks
AU - Shrestha, Amar
AU - Fang, Haowen
AU - Wu, Qing
AU - Qiu, Qinru
N1 - Publisher Copyright:
© 2019 Association for Computing Machinery.
PY - 2019/7/23
Y1 - 2019/7/23
N2 - Asynchronous event-driven computation and communication using spikes enable spiking neural networks (SNNs) to be massively parallel, extremely energy efficient, and highly robust on specialized neuromorphic hardware. However, the lack of a unified robust learning algorithm limits SNNs to shallow networks with low accuracies. Artificial neural networks (ANNs), in contrast, have the backpropagation algorithm, which can utilize gradient descent to train networks that are locally robust universal function approximators. But the backpropagation algorithm is neither biologically plausible nor friendly to neuromorphic implementation, because it requires: 1) separate forward and backward passes, 2) differentiable neurons, 3) high-precision propagated errors, 4) a coherent copy of the feedforward weight matrices in the backward pass, and 5) non-local weight updates. Thus, we propose an approximation of the backpropagation algorithm realized entirely with spiking neurons, and extend it to a local weight update rule that resembles a biologically plausible learning rule, spike-timing-dependent plasticity (STDP). This enables error propagation through spiking neurons, yielding a backpropagation algorithm for SNNs that is more biologically plausible and more amenable to neuromorphic implementation. We test the proposed algorithm on various traditional and non-traditional benchmarks with competitive results.
AB - Asynchronous event-driven computation and communication using spikes enable spiking neural networks (SNNs) to be massively parallel, extremely energy efficient, and highly robust on specialized neuromorphic hardware. However, the lack of a unified robust learning algorithm limits SNNs to shallow networks with low accuracies. Artificial neural networks (ANNs), in contrast, have the backpropagation algorithm, which can utilize gradient descent to train networks that are locally robust universal function approximators. But the backpropagation algorithm is neither biologically plausible nor friendly to neuromorphic implementation, because it requires: 1) separate forward and backward passes, 2) differentiable neurons, 3) high-precision propagated errors, 4) a coherent copy of the feedforward weight matrices in the backward pass, and 5) non-local weight updates. Thus, we propose an approximation of the backpropagation algorithm realized entirely with spiking neurons, and extend it to a local weight update rule that resembles a biologically plausible learning rule, spike-timing-dependent plasticity (STDP). This enables error propagation through spiking neurons, yielding a backpropagation algorithm for SNNs that is more biologically plausible and more amenable to neuromorphic implementation. We test the proposed algorithm on various traditional and non-traditional benchmarks with competitive results.
KW - Backpropagation
KW - Local Learning
KW - Neuromorphic
KW - Spike-Timing Dependent Plasticity
KW - Spiking Neural Networks
UR - http://www.scopus.com/inward/record.url?scp=85073199496&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85073199496&partnerID=8YFLogxK
U2 - 10.1145/3354265.3354275
DO - 10.1145/3354265.3354275
M3 - Conference contribution
AN - SCOPUS:85073199496
T3 - ACM International Conference Proceeding Series
BT - ICONS 2019 - Proceedings of International Conference on Neuromorphic Systems
PB - Association for Computing Machinery
T2 - 2019 International Conference on Neuromorphic Systems, ICONS 2019
Y2 - 23 July 2019 through 25 July 2019
ER -