System design for in-hardware STDP learning and spiking based probabilistic inference

Khadeer Ahmed, Amar Shrestha, Yanzhi Wang, Qinru Qiu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

The emerging field of neuromorphic computing offers a possible pathway toward the brain's computing performance and energy efficiency for cognitive applications such as pattern recognition, speech understanding, and natural language processing. In spiking neural networks (SNNs), information is encoded as sparsely distributed spike trains, enabling learning through the spike-timing dependent plasticity (STDP) mechanism. SNNs can potentially achieve ultra-low power consumption and distributed learning due to their inherently asynchronous and sparse inter-neuron communication. Several inroads have been made in SNN implementations; however, there is still a lack of computational models that lead to hardware implementations of large-scale SNNs with STDP capabilities. In this work, we present a set of neuron models and neuron circuit motifs that form SNNs capable of in-hardware, fully-distributed STDP learning and spiking-based probabilistic inference. Functions such as efficient Bayesian inference and unsupervised Hebbian learning are demonstrated on the proposed SNN system design. A highly scalable and flexible digital hardware implementation of the neuron model is also presented. Experimental results on two different applications, unsupervised feature extraction and inference-based sentence construction, demonstrate the proposed design's effectiveness in learning and inference.
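The STDP mechanism the abstract refers to can be illustrated with a generic pair-based rule: a synapse is potentiated when the presynaptic spike precedes the postsynaptic spike and depressed otherwise, with a magnitude that decays exponentially in the spike-time difference. The sketch below is a common textbook formulation, not the paper's specific neuron model; the parameter names `a_plus`, `a_minus`, and `tau` are illustrative choices.

```python
import math

def stdp_delta_w(t_pre, t_post, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Pair-based STDP weight update (illustrative, not the paper's model).

    t_pre, t_post : spike times of the pre- and postsynaptic neuron (ms)
    a_plus, a_minus : learning-rate amplitudes for potentiation/depression
    tau : time constant of the exponential STDP window (ms)
    """
    dt = t_post - t_pre
    if dt > 0:
        # Causal pairing: pre fires before post -> potentiation
        return a_plus * math.exp(-dt / tau)
    # Anti-causal pairing: post fires before (or with) pre -> depression
    return -a_minus * math.exp(dt / tau)
```

Because each update depends only on the local pair of spike times, the rule can be evaluated independently at every synapse, which is what makes fully-distributed in-hardware learning feasible.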

Original language: English (US)
Title of host publication: Proceedings - IEEE Computer Society Annual Symposium on VLSI, ISVLSI 2016
Publisher: IEEE Computer Society
Pages: 272-277
Number of pages: 6
Volume: 2016-September
ISBN (Electronic): 9781467390385
DOI: 10.1109/ISVLSI.2016.91
State: Published - Sep 2, 2016
Event: 15th IEEE Computer Society Annual Symposium on VLSI, ISVLSI 2016, Pittsburgh, United States
Duration: Jul 11, 2016 - Jul 13, 2016



Keywords

  • Bayesian inference
  • Bayesian neuron
  • digital neuron
  • spiking neural network
  • STDP learning
  • unsupervised feature learning
  • winner-take-all

ASJC Scopus subject areas

  • Hardware and Architecture
  • Control and Systems Engineering
  • Electrical and Electronic Engineering

Cite this

Ahmed, K., Shrestha, A., Wang, Y., & Qiu, Q. (2016). System design for in-hardware STDP learning and spiking based probabilistic inference. In Proceedings - IEEE Computer Society Annual Symposium on VLSI, ISVLSI 2016 (Vol. 2016-September, pp. 272-277). [7560209] IEEE Computer Society. https://doi.org/10.1109/ISVLSI.2016.91

