On data depth and distribution-free discriminant analysis using separating surfaces

Ghosh, Anil K.; Chaudhuri, Probal (2005) On data depth and distribution-free discriminant analysis using separating surfaces. Bernoulli, 11 (1). pp. 1-27. ISSN 1350-7265

PDF (Publisher Version), 380kB

Official URL: http://projecteuclid.org/DPubS?service=UI&version=...

Related URL: http://dx.doi.org/10.3150/bj/1110228239

Abstract

A very well-known traditional approach in discriminant analysis is to use some linear (or nonlinear) combination of measurement variables which can enhance class separability. For instance, a linear (or a quadratic) classifier finds the linear projection (or the quadratic function) of the measurement variables that will maximize the separation between the classes. These techniques are very useful in obtaining good lower-dimensional views of class separability. Fisher's discriminant analysis, which is primarily motivated by the multivariate normal distribution, uses the first- and second-order moments of the training sample to build such classifiers. These estimates, however, are highly sensitive to outliers, and they are not reliable for heavy-tailed distributions. This paper investigates two distribution-free methods for linear classification, which are based on the notions of statistical depth functions. One of these classifiers is closely related to Tukey's half-space depth, while the other is based on the concept of regression depth. Both of these methods can be generalized for constructing nonlinear surfaces to discriminate among competing classes. These depth-based methods assume some finite-dimensional parametric form of the discriminating surface and use the distributional geometry of the data cloud to build the classifier. We use a few simulated and real data sets to examine the performance of these discriminant analysis tools and study their asymptotic properties under appropriate regularity conditions.
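To give a concrete feel for the depth-based idea mentioned in the abstract, the sketch below illustrates a related, simpler construction: Tukey's half-space depth of a point within each class is approximated by random projections, and the point is assigned to the class in which it lies deepest. This is only an illustrative sketch, not the classifiers studied in the paper; the function names, the number of random directions, and the toy location-shift data are all assumptions made here for demonstration.

    # Illustrative sketch (not the authors' method): maximum-depth classification
    # with Tukey's half-space depth approximated by random projections.
    import numpy as np

    rng = np.random.default_rng(0)

    def halfspace_depth(x, sample, n_dir=500):
        """Approximate Tukey's half-space depth of point x with respect to
        `sample`: the smallest fraction of sample points lying on one side of a
        hyperplane through x, minimized over random projection directions."""
        d = sample.shape[1]
        dirs = rng.standard_normal((n_dir, d))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        proj_sample = sample @ dirs.T      # projections of the sample, shape (n, n_dir)
        proj_x = x @ dirs.T                # projections of x, shape (n_dir,)
        frac_above = (proj_sample >= proj_x).mean(axis=0)
        frac_below = (proj_sample <= proj_x).mean(axis=0)
        return np.minimum(frac_above, frac_below).min()

    def max_depth_classify(x, class_samples):
        """Assign x to the class in which it attains the largest estimated depth."""
        depths = {label: halfspace_depth(x, s) for label, s in class_samples.items()}
        return max(depths, key=depths.get)

    # Toy two-class location-shift example (illustrative data only).
    class_samples = {
        0: rng.standard_normal((200, 2)),
        1: rng.standard_normal((200, 2)) + np.array([2.0, 2.0]),
    }
    print(max_depth_classify(np.array([1.8, 1.9]), class_samples))

Because depth depends only on the distributional geometry of the data cloud rather than on sample moments, a rule of this kind is far less affected by outliers or heavy tails than moment-based rules such as Fisher's discriminant analysis, which is the motivation the abstract describes.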

Item Type:Article
Source:Copyright of this article belongs to the International Statistical Institute.
Keywords:Bayes Risk; Elliptic Symmetry; Generalized U-statistic; Half-space Depth; Linear Discriminant Analysis; Location-shift Models; Misclassification Rates; Optimal Bayes Classifier; Quadratic Discriminant Analysis; Regression Depth; Robustness; Vapnik-Chervonenkis Dimension
ID Code:8120
Deposited On:26 Oct 2010 04:25
Last Modified:16 May 2016 18:11
