A programmable neural virtual machine based on a fast store-erase learning rule

Garrett Katz, Gregory P. Davis, Rodolphe J. Gentili, James A. Reggia

Research output: Contribution to journal › Article

Abstract

We present a neural architecture that uses a novel local learning rule to represent and execute arbitrary, symbolic programs written in a conventional assembly-like language. This Neural Virtual Machine (NVM) is purely neurocomputational but supports all of the key functionality of a traditional computer architecture. Unlike other programmable neural networks, the NVM uses principles such as fast non-iterative local learning, distributed representation of information, program-independent circuitry, itinerant attractor dynamics, and multiplicative gating for both activity and plasticity. We present the NVM in detail, theoretically analyze its properties, and conduct empirical computer experiments that quantify its performance and demonstrate that it works effectively.

Original language: English (US)
Pages (from-to): 10-30
Number of pages: 21
Journal: Neural Networks
Volume: 119
DOI: 10.1016/j.neunet.2019.07.017
State: Published - Nov 1 2019

Keywords

  • Itinerant attractor dynamics
  • Local learning
  • Multiplicative gating
  • Programmable neural networks
  • Symbolic processing

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Artificial Intelligence

Cite this

A programmable neural virtual machine based on a fast store-erase learning rule. / Katz, Garrett; Davis, Gregory P.; Gentili, Rodolphe J.; Reggia, James A.

In: Neural Networks, Vol. 119, 01.11.2019, p. 10-30.

@article{8471e13d47df4f01b52a030fb2d6c802,
  title = "A programmable neural virtual machine based on a fast store-erase learning rule",
  abstract = "We present a neural architecture that uses a novel local learning rule to represent and execute arbitrary, symbolic programs written in a conventional assembly-like language. This Neural Virtual Machine (NVM) is purely neurocomputational but supports all of the key functionality of a traditional computer architecture. Unlike other programmable neural networks, the NVM uses principles such as fast non-iterative local learning, distributed representation of information, program-independent circuitry, itinerant attractor dynamics, and multiplicative gating for both activity and plasticity. We present the NVM in detail, theoretically analyze its properties, and conduct empirical computer experiments that quantify its performance and demonstrate that it works effectively.",
  keywords = "Itinerant attractor dynamics, Local learning, Multiplicative gating, Programmable neural networks, Symbolic processing",
  author = "Katz, {Garrett} and Davis, {Gregory P.} and Gentili, {Rodolphe J.} and Reggia, {James A.}",
  year = "2019",
  month = "11",
  day = "1",
  doi = "10.1016/j.neunet.2019.07.017",
  language = "English (US)",
  volume = "119",
  pages = "10--30",
  journal = "Neural Networks",
  issn = "0893-6080",
  publisher = "Elsevier",
}
