Scalable NoC-based Neuromorphic Hardware Learning and Inference

Haowen Fang, Amar Shrestha, De Ma, Qinru Qiu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

Bio-inspired neuromorphic hardware is a research direction to approach the brain's computational power and energy efficiency. Spiking neural networks (SNN) encode information as sparsely distributed spike trains and employ the spike-timing-dependent plasticity (STDP) mechanism for learning. Existing hardware implementations of SNN are limited in scale or do not have in-hardware learning capability. In this work, we propose a low-cost scalable Network-on-Chip (NoC) based SNN hardware architecture with fully distributed in-hardware STDP learning capability. All hardware neurons work in parallel and communicate through the NoC. This enables the chip-level interconnection, scalability and reconfigurability necessary for deploying different applications. The hardware is applied to learn MNIST digits as an evaluation of its learning capability. We explore the design space to study the trade-offs among speed, area and energy, and discuss how to use this procedure to find the optimal architecture configuration.
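The STDP mechanism named in the abstract can be illustrated with a minimal pair-based weight-update sketch. The function name, learning rates, and time constant below are illustrative assumptions, not details taken from the paper or its hardware implementation:

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau=20.0,
                w_min=0.0, w_max=1.0):
    """Pair-based STDP sketch (parameters are illustrative, not the paper's).

    w  : current synaptic weight
    dt : t_post - t_pre in ms; dt > 0 means the presynaptic spike
         preceded the postsynaptic spike.
    """
    if dt > 0:
        # Pre-before-post: potentiate, decaying with the spike-time gap.
        dw = a_plus * np.exp(-dt / tau)
    else:
        # Post-before-pre (or coincident): depress.
        dw = -a_minus * np.exp(dt / tau)
    # Keep the weight inside its allowed range.
    return float(np.clip(w + dw, w_min, w_max))
```

In a fully distributed design such as the one proposed, each hardware neuron would apply a rule of this shape locally using only spike timestamps it observes, which is what removes the need for a central learning engine.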

Original language: English (US)
Title of host publication: 2018 International Joint Conference on Neural Networks, IJCNN 2018 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Volume: 2018-July
ISBN (Electronic): 9781509060146
DOI: 10.1109/IJCNN.2018.8489619
State: Published - Oct 10 2018
Event: 2018 International Joint Conference on Neural Networks, IJCNN 2018 - Rio de Janeiro, Brazil
Duration: Jul 8 2018 - Jul 13 2018

Other

Other: 2018 International Joint Conference on Neural Networks, IJCNN 2018
Country: Brazil
City: Rio de Janeiro
Period: 7/8/18 - 7/13/18

Keywords

  • Network on chip
  • Spiking neural network
  • STDP learning
  • Unsupervised learning

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Fang, H., Shrestha, A., Ma, D., & Qiu, Q. (2018). Scalable NoC-based Neuromorphic Hardware Learning and Inference. In 2018 International Joint Conference on Neural Networks, IJCNN 2018 - Proceedings (Vol. 2018-July). [8489619] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IJCNN.2018.8489619

@inproceedings{8bb182e04bd142c18e3123ddebde90e5,
title = "Scalable NoC-based Neuromorphic Hardware Learning and Inference",
abstract = "Bio-inspired neuromorphic hardware is a research direction to approach the brain's computational power and energy efficiency. Spiking neural networks (SNN) encode information as sparsely distributed spike trains and employ the spike-timing-dependent plasticity (STDP) mechanism for learning. Existing hardware implementations of SNN are limited in scale or do not have in-hardware learning capability. In this work, we propose a low-cost scalable Network-on-Chip (NoC) based SNN hardware architecture with fully distributed in-hardware STDP learning capability. All hardware neurons work in parallel and communicate through the NoC. This enables the chip-level interconnection, scalability and reconfigurability necessary for deploying different applications. The hardware is applied to learn MNIST digits as an evaluation of its learning capability. We explore the design space to study the trade-offs among speed, area and energy, and discuss how to use this procedure to find the optimal architecture configuration.",
keywords = "Network on chip, Spiking neural network, STDP learning, Unsupervised learning",
author = "Haowen Fang and Amar Shrestha and De Ma and Qinru Qiu",
year = "2018",
month = "10",
day = "10",
doi = "10.1109/IJCNN.2018.8489619",
language = "English (US)",
volume = "2018-July",
booktitle = "2018 International Joint Conference on Neural Networks, IJCNN 2018 - Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}

TY - GEN

T1 - Scalable NoC-based Neuromorphic Hardware Learning and Inference

AU - Fang, Haowen

AU - Shrestha, Amar

AU - Ma, De

AU - Qiu, Qinru

PY - 2018/10/10

Y1 - 2018/10/10

N2 - Bio-inspired neuromorphic hardware is a research direction to approach the brain's computational power and energy efficiency. Spiking neural networks (SNN) encode information as sparsely distributed spike trains and employ the spike-timing-dependent plasticity (STDP) mechanism for learning. Existing hardware implementations of SNN are limited in scale or do not have in-hardware learning capability. In this work, we propose a low-cost scalable Network-on-Chip (NoC) based SNN hardware architecture with fully distributed in-hardware STDP learning capability. All hardware neurons work in parallel and communicate through the NoC. This enables the chip-level interconnection, scalability and reconfigurability necessary for deploying different applications. The hardware is applied to learn MNIST digits as an evaluation of its learning capability. We explore the design space to study the trade-offs among speed, area and energy, and discuss how to use this procedure to find the optimal architecture configuration.

AB - Bio-inspired neuromorphic hardware is a research direction to approach the brain's computational power and energy efficiency. Spiking neural networks (SNN) encode information as sparsely distributed spike trains and employ the spike-timing-dependent plasticity (STDP) mechanism for learning. Existing hardware implementations of SNN are limited in scale or do not have in-hardware learning capability. In this work, we propose a low-cost scalable Network-on-Chip (NoC) based SNN hardware architecture with fully distributed in-hardware STDP learning capability. All hardware neurons work in parallel and communicate through the NoC. This enables the chip-level interconnection, scalability and reconfigurability necessary for deploying different applications. The hardware is applied to learn MNIST digits as an evaluation of its learning capability. We explore the design space to study the trade-offs among speed, area and energy, and discuss how to use this procedure to find the optimal architecture configuration.

KW - Network on chip

KW - Spiking neural network

KW - STDP learning

KW - Unsupervised learning

UR - http://www.scopus.com/inward/record.url?scp=85056549438&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85056549438&partnerID=8YFLogxK

U2 - 10.1109/IJCNN.2018.8489619

DO - 10.1109/IJCNN.2018.8489619

M3 - Conference contribution

AN - SCOPUS:85056549438

VL - 2018-July

BT - 2018 International Joint Conference on Neural Networks, IJCNN 2018 - Proceedings

PB - Institute of Electrical and Electronics Engineers Inc.

ER -