The backpropagation algorithm converges very slowly on two-class problems in which most of the exemplars belong to one dominant class. Our analysis shows that this occurs because the net error gradient vector is dominated by the larger class, so much so that the net error for the exemplars of the smaller class increases significantly in the initial iterations; the subsequent rate of convergence of the net error is then very low. We present a modified technique for computing a direction in weight space that decreases the error for each class. Using this algorithm, we have been able to accelerate learning on two-class classification problems by an order of magnitude.
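The abstract does not give the exact construction of the per-class descent direction, but the idea can be sketched. One simple direction that decreases the error of each class to first order is the sum of the unit-normalized per-class gradients: its inner product with each class gradient g_c is ||g_c||(1 + cos t) >= 0, where t is the angle between the two gradients. The logistic model, dataset, and normalized-sum construction below are illustrative assumptions, not necessarily the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny imbalanced two-class dataset: 95 vs 5 exemplars (illustrative).
X = np.vstack([rng.normal(loc=-1.0, size=(95, 2)),
               rng.normal(loc=+1.0, size=(5, 2))])
y = np.array([0] * 95 + [1] * 5)

w = np.zeros(2)  # current weight vector

def class_gradient(w, X, y, cls):
    """Gradient of the summed logistic loss over one class's exemplars."""
    mask = y == cls
    p = 1.0 / (1.0 + np.exp(-X[mask] @ w))  # predicted probabilities
    return X[mask].T @ (p - y[mask])

g0 = class_gradient(w, X, y, 0)
g1 = class_gradient(w, X, y, 1)

# Plain net gradient: dominated by the 95-exemplar class.
g_total = g0 + g1

# Balanced direction: sum of unit per-class gradients. Stepping along -d
# decreases the error of BOTH classes (to first order) whenever the two
# class gradients are not exactly opposed.
d = g0 / np.linalg.norm(g0) + g1 / np.linalg.norm(g1)

# Directional derivatives along -d: negative for each class means
# the step reduces each class's error.
print(-np.dot(d, g0), -np.dot(d, g1))
```

A weight update would then be `w -= eta * d` for some step size `eta`, in place of the plain gradient step `w -= eta * g_total`.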