Scaling multi-class support vector machines using inter-class confusion

Godbole, Shantanu; Sarawagi, Sunita; Chakrabarti, Soumen (2002). Scaling multi-class support vector machines using inter-class confusion. In: Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, p. 513.


Official URL: http://doi.org/10.1145/775047.775122

Abstract

Support vector machines (SVMs) excel at two-class discriminative learning problems. They often outperform generative classifiers, especially those that use inaccurate generative models, such as the naïve Bayes (NB) classifier. On the other hand, generative classifiers handle an arbitrary number of classes efficiently, and NB classifiers train much faster than SVMs owing to their extreme simplicity. SVMs, by contrast, handle multi-class problems by learning redundant yes/no (one-vs-others) classifiers, one for each class, which widens the speed gap further. We propose a new technique for multi-way classification that exploits the accuracy of SVMs and the speed of NB classifiers. We first use an NB classifier to quickly compute a confusion matrix, which is then used to reduce the number and complexity of the two-class SVMs built in the second stage. During testing, we first obtain the prediction of an NB classifier and use it to selectively apply only a subset of the two-class SVMs. On standard benchmarks, our algorithm is 3 to 6 times faster than SVMs and yet matches or even exceeds their accuracy.
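The two-stage idea in the abstract can be illustrated with a minimal sketch. This is not the authors' exact algorithm: it uses scikit-learn's `GaussianNB` and `SVC` on the `digits` dataset, and the rule "train a pairwise SVM for any class pair the NB model confuses on its own training data" is a hypothetical stand-in for the paper's confusion-based pruning. At test time, NB proposes a class and only the SVMs involving confusable pairs for that class are consulted.

```python
# Sketch of confusion-guided two-stage classification (assumptions: dataset,
# threshold, and tie-breaking rule are illustrative, not the paper's method).
import numpy as np
from sklearn.datasets import load_digits
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stage 1: fast generative classifier and its confusion matrix.
nb = GaussianNB().fit(X_tr, y_tr)
cm = confusion_matrix(y_tr, nb.predict(X_tr))
n_classes = cm.shape[0]

# Stage 2: train a two-class SVM only for pairs (i, j) that NB confuses,
# instead of all n*(n-1)/2 pairs or n one-vs-others classifiers.
svms = {}
for i in range(n_classes):
    for j in range(i + 1, n_classes):
        if cm[i, j] + cm[j, i] > 0:  # hypothetical threshold: any confusion
            mask = (y_tr == i) | (y_tr == j)
            svms[(i, j)] = SVC(kernel="linear").fit(X_tr[mask], y_tr[mask])

def predict(x):
    """NB proposes a class; SVMs for its confusable pairs refine the answer."""
    c = int(nb.predict(x.reshape(1, -1))[0])
    for (i, j), svm in svms.items():
        if c in (i, j):
            c = int(svm.predict(x.reshape(1, -1))[0])
    return c

preds = np.array([predict(x) for x in X_te])
acc = float((preds == y_te).mean())
```

Only a fraction of the possible pairwise SVMs are trained, and each test point touches only the SVMs sharing a class with the NB guess, which is the source of the speedup the abstract claims.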

Item Type: Article
Source: Copyright of this article belongs to ACM, Inc.
ID Code: 128416
Deposited On: 20 Oct 2022 08:57
Last Modified: 20 Oct 2022 08:57
