Bounds on the Number of Samples Needed for Neural Learning

Kishan G. Mehrotra, Chilukuri K. Mohan, Sanjay Ranka

Research output: Contribution to journal › Article › peer-review

66 Scopus citations

Abstract

This paper addresses the relationship between the number of hidden layer nodes in a neural network, the complexity of a multiclass discrimination problem, and the number of samples needed for effective learning. Bounds are given for the latter. We show that Ω(min(d, n) · M) boundary samples are required for successful classification of M clusters of samples using a two-hidden-layer neural network with d-dimensional inputs and n nodes in the first hidden layer.
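The bound in the abstract can be read as a growth rate in the problem parameters. The sketch below (not from the paper; the function name is hypothetical and the constant hidden in the Ω notation is ignored) simply evaluates min(d, n) · M for a given input dimension d, first-hidden-layer width n, and cluster count M.

```python
# Illustrative sketch, assuming only the abstract's statement of the bound:
# the number of boundary samples required grows as min(d, n) * M.

def boundary_sample_lower_bound(d: int, n: int, M: int) -> int:
    """Return min(d, n) * M, the growth rate of the required boundary samples
    (the constant factor hidden in the Omega notation is not modeled)."""
    if min(d, n, M) < 1:
        raise ValueError("d, n, and M must be positive integers")
    return min(d, n) * M


if __name__ == "__main__":
    # Example: 10-dimensional inputs, 4 nodes in the first hidden layer, 5 clusters.
    print(boundary_sample_lower_bound(d=10, n=4, M=5))  # -> 20
```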

Original language: English (US)
Pages (from-to): 548-558
Number of pages: 11
Journal: IEEE Transactions on Neural Networks
Volume: 2
Issue number: 6
State: Published - Nov 1991

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
