Scaling multi-class support vector machines using inter-class confusion

Godbole, Shantanu; Sarawagi, Sunita; Chakrabarti, Soumen (2002). Scaling multi-class support vector machines using inter-class confusion. In: Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.

Full text not available from this repository.

Official URL: http://doi.org/10.1145/775047.775122


Abstract

Support vector machines (SVMs) excel at two-class discriminative learning problems. They often outperform generative classifiers, especially those built on inaccurate generative models, such as the naïve Bayes (NB) classifier. On the other hand, generative classifiers handle an arbitrary number of classes efficiently, and NB classifiers train much faster than SVMs owing to their extreme simplicity. SVMs, by contrast, handle multi-class problems by learning a redundant set of yes/no (one-vs-others) classifiers, one per class, which widens the speed gap further. We propose a new technique for multi-way classification that combines the accuracy of SVMs with the speed of NB classifiers. We first use an NB classifier to quickly compute a confusion matrix, which is then used to reduce the number and complexity of the two-class SVMs built in the second stage. During testing, we first obtain the prediction of the NB classifier and use it to selectively apply only a subset of the two-class SVMs. On standard benchmarks, our algorithm is 3 to 6 times faster than SVMs yet matches or even exceeds their accuracy.
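The two-stage scheme described in the abstract can be sketched as follows. This is a minimal illustration using scikit-learn (an assumption; the paper predates that library), and it reads "reduce the number and complexity of the two-class SVMs" as training pairwise SVMs only between classes that the NB classifier actually confuses. The confusion threshold and the pairwise tie-breaking at test time are illustrative choices, not the paper's exact criteria.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix

# Synthetic multi-class data standing in for a text-classification benchmark.
X, y = make_classification(n_samples=600, n_features=20, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_train, y_train = X[:400], y[:400]
X_test, y_test = X[400:], y[400:]
n_classes = 4

# Stage 1: a fast NB classifier yields a confusion matrix on held-out data.
nb = GaussianNB().fit(X_train[:300], y_train[:300])
cm = confusion_matrix(y_train[300:], nb.predict(X_train[300:]),
                      labels=list(range(n_classes)))

# Classes i and j are "confusable" if NB mixes them up often enough.
# The threshold here is an assumption; the paper's criterion may differ.
threshold = 2
confusable = {i: {j for j in range(n_classes)
                  if j != i and (cm[i, j] > threshold or cm[j, i] > threshold)}
              for i in range(n_classes)}

# Stage 2: train a two-class SVM only for each confusable pair,
# instead of one one-vs-others SVM per class over all the data.
svms = {}
for i, js in confusable.items():
    for j in js:
        if i < j:
            mask = np.isin(y_train, [i, j])
            svms[(i, j)] = SVC(kernel="linear").fit(X_train[mask], y_train[mask])

def predict(x):
    """NB proposes a class; SVMs re-check it against its confusable peers."""
    x = x.reshape(1, -1)
    pred = int(nb.predict(x)[0])
    for j in sorted(confusable.get(pred, ())):
        pair = (min(pred, j), max(pred, j))
        if pair in svms:  # pairs never trained are simply skipped
            pred = int(svms[pair].predict(x)[0])
    return pred

preds = [predict(x) for x in X_test]
```

Because most test points never touch an SVM (their NB prediction has no confusable peers), the per-example cost stays close to that of NB alone, which is the source of the 3- to 6-fold speedup the abstract reports.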

Item Type: Conference or Workshop Item (Paper)
Source: Copyright of this article belongs to the Association for Computing Machinery
ID Code: 130961
Deposited On: 02 Dec 2022 04:05
Last Modified: 02 Dec 2022 04:05
