A Robust Minimax Approach to Classification

Lanckriet, Gert R. G.; El Ghaoui, Laurent; Bhattacharyya, Chiranjib; Jordan, Michael I. (2003) A Robust Minimax Approach to Classification. Journal of Machine Learning Research, 3. pp. 555-582. ISSN 1532-4435


Related URL: http://dx.doi.org/10.1162/153244303321897726

Abstract

When constructing a classifier, the probability of correct classification of future data points should be maximized. We consider a binary classification problem where the mean and covariance matrix of each class are assumed to be known. No further assumptions are made with respect to the class-conditional distributions. Misclassification probabilities are then controlled in a worst-case setting: that is, under all possible choices of class-conditional densities with given mean and covariance matrix, we minimize the worst-case (maximum) probability of misclassification of future data points. For a linear decision boundary, this desideratum is translated in a very direct way into a (convex) second order cone optimization problem, with complexity similar to a support vector machine problem. The minimax problem can be interpreted geometrically as minimizing the maximum of the Mahalanobis distances to the two classes. We address the issue of robustness with respect to estimation errors (in the means and covariances of the classes) via a simple modification of the input data. We also show how to exploit Mercer kernels in this setting to obtain nonlinear decision boundaries, yielding a classifier which proves to be competitive with current methods, including support vector machines. An important feature of this method is that a worst-case bound on the probability of misclassification of future data is always obtained explicitly.
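The linear version of the method described above can be sketched numerically. Under the abstract's normalization, the minimax problem reduces to minimizing sqrt(a'Σ₊a) + sqrt(a'Σ₋a) subject to a'(μ₊ − μ₋) = 1; the optimal value gives κ = 1/optimum, and the worst-case misclassification bound is 1/(1 + κ²). The sketch below is illustrative only: the data, the general-purpose SLSQP solver, and all variable names are assumptions (the paper itself uses a dedicated second-order cone program), not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical two-class data; only the empirical means and covariances
# enter the classifier, per the distribution-free assumption.
rng = np.random.default_rng(0)
X_pos = rng.normal([2.0, 2.0], 1.0, size=(200, 2))
X_neg = rng.normal([-2.0, -2.0], 1.0, size=(200, 2))

mu_p, mu_n = X_pos.mean(axis=0), X_neg.mean(axis=0)
S_p = np.cov(X_pos, rowvar=False)
S_n = np.cov(X_neg, rowvar=False)

# Minimize sqrt(a'S_p a) + sqrt(a'S_n a) subject to a'(mu_p - mu_n) = 1.
objective = lambda a: np.sqrt(a @ S_p @ a) + np.sqrt(a @ S_n @ a)
constraint = {"type": "eq", "fun": lambda a: a @ (mu_p - mu_n) - 1.0}
res = minimize(objective, x0=mu_p - mu_n, constraints=[constraint])

a = res.x
kappa = 1.0 / res.fun                  # optimal margin parameter
bound = 1.0 / (1.0 + kappa**2)         # explicit worst-case error bound
b = a @ mu_p - kappa * np.sqrt(a @ S_p @ a)  # decision threshold
```

A new point x would then be classified by the sign of `a @ x - b`, and `bound` is the explicit worst-case probability of misclassification that the abstract emphasizes.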

Item Type: Article
Keywords: classification, kernel methods, convex optimization, second order cone programming
