Support vector machines (SVMs) have recently shown great potential for the classification of remotely sensed hyperspectral data acquired in a large number of spectral bands. Owing to the Hughes phenomenon, conventional parametric classifiers fail to classify such high-dimensional datasets. In the past, neural networks and decision tree classifiers, which are nonparametric in nature, have frequently been used for the classification of multispectral remote sensing data; these, however, have marked limitations when applied to hyperspectral data. In this paper, we present the results of applying SVMs to a 16-class classification of an AVIRIS image and compare their performance with decision tree, back-propagation (BP) and radial basis function (RBF) neural network classifiers. A number of parameters may affect the accuracy of SVM-based classifiers; the best values of these parameters were selected on the basis of a set of hypotheses and experiments. All SVM classifications were performed using an in-house code developed in a Matlab environment. The Kappa coefficient of agreement was used to assess classification accuracy, and the differences in accuracy were statistically evaluated using a pairwise Z-test. SVM classification using a polynomial kernel of degree 2 produced an accuracy of 96.94%, whereas the decision tree, BP and RBF neural network classifiers achieved accuracies of 74.75%, 38.03% and 95.30%, respectively. This clearly illustrates that the accuracy of the SVM classifier is significantly higher than that of the decision tree and BP neural network classifiers at the 95% confidence level. Although the improvement in SVM classification accuracy with respect to the RBF neural network classifier was not statistically significant, the SVM classifier was more computationally efficient than the RBF classifier for hyperspectral image classification.
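The evaluation pipeline described above (a degree-2 polynomial-kernel SVM, a Kappa coefficient of agreement, and a pairwise Z-test on the Kappa values) can be sketched as follows. This is a minimal illustrative stand-in, not the paper's method: the authors used in-house Matlab code on AVIRIS data, whereas this sketch uses Python/scikit-learn on synthetic data, compares against only a decision tree, and approximates the Kappa variance as k(1-k)/n rather than the full delta-method variance the paper likely used.

```python
# Hedged sketch of the accuracy-comparison workflow; synthetic data,
# scikit-learn stand-ins -- NOT the paper's in-house Matlab implementation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import cohen_kappa_score

# Synthetic high-dimensional multi-class data as a placeholder for the
# AVIRIS image (the real experiment had 16 classes and many spectral bands).
X, y = make_classification(n_samples=1000, n_features=50, n_informative=20,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# SVM with a polynomial kernel of degree 2, as in the paper's best result.
svm = SVC(kernel="poly", degree=2).fit(X_tr, y_tr)
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# Kappa coefficient of agreement for each classifier.
k_svm = cohen_kappa_score(y_te, svm.predict(X_te))
k_tree = cohen_kappa_score(y_te, tree.predict(X_te))

# Pairwise Z-test on the two Kappa values. The variance here is the
# simple approximation k(1-k)/n, an assumption of this sketch.
n = len(y_te)
kappa_var = lambda k: k * (1.0 - k) / n
z = (k_svm - k_tree) / np.sqrt(kappa_var(k_svm) + kappa_var(k_tree))

# |Z| > 1.96 indicates a significant difference at the 95% confidence level.
significant = abs(z) > 1.96
```

The design mirrors the abstract's protocol: fit each classifier, score both with Kappa (which corrects overall accuracy for chance agreement), and decide significance by comparing |Z| against the 1.96 critical value of the standard normal distribution.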