Efficient Classification for Multiclass Problems Using Modular Neural Networks

Rangachari Anand, Kishan G Mehrotra, Chilukuri K. Mohan, Sanjay Ranka

Research output: Contribution to journal › Article › peer-review

332 Scopus citations


The rate of convergence of net output error is very low when training feedforward neural networks for multiclass problems using the back-propagation algorithm. Although backpropagation reduces the Euclidean distance between the actual and desired output vectors, the differences between some individual components of these vectors may increase in the first iteration. Furthermore, the magnitudes of subsequent weight changes in each iteration are very small, so many iterations are required to compensate for the error that increased in those components during the initial iterations. Our approach is to use a modular network architecture, reducing a K-class problem to a set of K two-class problems, with a separately trained network for each of the simpler problems. Speedups of one order of magnitude have been obtained experimentally, and in some cases convergence was possible using the modular approach but not using a nonmodular network.
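The decomposition described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' original implementation: it assumes one small one-hidden-layer network per class, each trained by plain batch back-propagation on squared error to separate its class from the rest, with the final label taken as the module with the highest output. All function names and hyperparameters here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_module(X, t, hidden=8, epochs=2000, lr=2.0):
    """Train one two-class feedforward module (single hidden layer,
    sigmoid units) with batch back-propagation on squared error.
    t is a 0/1 target vector for 'this class vs. the rest'."""
    n, d = X.shape
    W1 = rng.normal(0.0, 0.5, (d, hidden))
    W2 = rng.normal(0.0, 0.5, (hidden, 1))
    t = t.reshape(-1, 1)
    for _ in range(epochs):
        h = _sigmoid(X @ W1)             # hidden activations
        y = _sigmoid(h @ W2)             # module output in (0, 1)
        d2 = (y - t) * y * (1.0 - y)     # output-layer delta (MSE)
        d1 = (d2 @ W2.T) * h * (1.0 - h) # hidden-layer delta
        W2 -= lr * (h.T @ d2) / n
        W1 -= lr * (X.T @ d1) / n
    return W1, W2

def train_modular(X, labels, K):
    """Reduce a K-class problem to K separately trained two-class modules."""
    return [train_module(X, (labels == k).astype(float)) for k in range(K)]

def predict(modules, X):
    """Classify by the module that responds most strongly."""
    scores = np.column_stack(
        [_sigmoid(_sigmoid(X @ W1) @ W2).ravel() for W1, W2 in modules]
    )
    return scores.argmax(axis=1)
```

Because each module solves a simpler two-class problem, the modules can also be trained independently (even in parallel), which is one source of the speedup reported in the paper.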

Original language: English (US)
Pages (from-to): 117-124
Number of pages: 8
Journal: IEEE Transactions on Neural Networks
Issue number: 1
State: Published - Jan 1995

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

