Proofs that backpropagation produces outputs equal to the posterior probabilities of the training data leave open the question of how reduced resources affect the accuracy of estimation. The authors empirically explore the effects of reduced resources on the ability of networks to estimate the posterior probabilities of data in two simple classification problems, one with independent cues and one with dependent cues. They contrast the effects of restricting hidden units with those of restricting training cycles for classifying the different cues. Marginal probabilities tend to be estimated incorrectly, and dependencies among the cues affect both the course and the outcome of training. In the dependent-cue case, it was found that even a slight difference between the posteriors for the odd and the even patterns can impair estimation of the posteriors.
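The convergence result underlying this work can be illustrated with a minimal sketch: a single sigmoid unit trained by gradient descent on cross-entropy loss, whose output approaches the empirical posterior P(y = 1 | cue). The cue values, posterior probabilities, sample size, and learning rate below are hypothetical choices for illustration, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# One binary cue; true posteriors chosen arbitrarily for illustration.
p_true = {0: 0.2, 1: 0.8}
n = 20000
x = rng.integers(0, 2, size=n).astype(float)
y = (rng.random(n) < np.where(x == 1.0, p_true[1], p_true[0])).astype(float)

# A single sigmoid unit trained with cross-entropy by full-batch gradient
# descent; with ample data and training cycles its output should converge
# to the posterior P(y = 1 | x).
w, b = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    z = 1.0 / (1.0 + np.exp(-(w * x + b)))
    grad = z - y                      # dL/dlogit for sigmoid + cross-entropy
    w -= lr * np.mean(grad * x)
    b -= lr * np.mean(grad)

for cue in (0.0, 1.0):
    pred = 1.0 / (1.0 + np.exp(-(w * cue + b)))
    emp = y[x == cue].mean()          # empirical posterior in the sample
    print(f"cue={cue:.0f}: network={pred:.3f}  empirical={emp:.3f}")
```

Restricting the number of training cycles or the model's capacity, as the paper does, degrades how closely the network's outputs track these posteriors.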