This paper addresses the relationship between the number of hidden-layer nodes in a neural network, the complexity of a multiclass discrimination problem, and the number of samples needed for effective learning; bounds are given for the latter. We show that Ω(min(d, n) · M) boundary samples are required to successfully classify M clusters of samples using a two-hidden-layer neural network with d-dimensional inputs and n nodes in the first hidden layer.