A programmable neural virtual machine based on a fast store-erase learning rule

Garrett E. Katz, Gregory P. Davis, Rodolphe J. Gentili, James A. Reggia

Research output: Contribution to journal › Article › peer-review


Abstract

We present a neural architecture that uses a novel local learning rule to represent and execute arbitrary, symbolic programs written in a conventional assembly-like language. This Neural Virtual Machine (NVM) is purely neurocomputational but supports all of the key functionality of a traditional computer architecture. Unlike other programmable neural networks, the NVM uses principles such as fast non-iterative local learning, distributed representation of information, program-independent circuitry, itinerant attractor dynamics, and multiplicative gating for both activity and plasticity. We present the NVM in detail, theoretically analyze its properties, and conduct empirical computer experiments that quantify its performance and demonstrate that it works effectively.
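The "fast store-erase" idea in the title can be illustrated with a minimal one-shot associative update: a single non-iterative, local weight change that erases whatever the weights previously recalled for a key and stores a new value in its place. This is an illustrative sketch only (names, linear recall, and the normalization are assumptions for exposition; the NVM's actual rule additionally involves nonlinear activity dynamics and multiplicative gating of plasticity):

```python
import numpy as np

def store(W, key, value):
    """One-shot store-erase style update (illustrative sketch).

    The correction term (value - W @ key) cancels the previous recall
    for this key ("erase") while writing the new association ("store"),
    so W @ key == value after a single, non-iterative update.
    """
    k = np.asarray(key, dtype=float)
    return W + np.outer(value - W @ k, k) / np.dot(k, k)

rng = np.random.default_rng(0)
N = 64
W = np.zeros((N, N))
k1, v1, v2 = rng.standard_normal((3, N))

W = store(W, k1, v1)
assert np.allclose(W @ k1, v1)   # stored in one step

W = store(W, k1, v2)             # re-storing the same key erases v1
assert np.allclose(W @ k1, v2)
```

Because the update is an outer product of local pre- and post-synaptic quantities, each weight change depends only on the activities at its own synapse, which is what makes a rule of this general shape "local" and "non-iterative."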

Original language: English (US)
Pages (from-to): 10-30
Number of pages: 21
Journal: Neural Networks
Volume: 119
DOIs
State: Published - Nov 2019
Externally published: Yes

Keywords

  • Itinerant attractor dynamics
  • Local learning
  • Multiplicative gating
  • Programmable neural networks
  • Symbolic processing

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Artificial Intelligence
