Yegnanarayana, B.; Kishore, S. P. (2002) AANN: an alternative to GMM for pattern recognition. Neural Networks, 15(3), pp. 459-469. ISSN 0893-6080
Full text not available from this repository.
Official URL: http://www.sciencedirect.com/science/article/pii/S...
Related URL: http://dx.doi.org/10.1016/S0893-6080(02)00019-9
Abstract
The objective in any pattern recognition problem is to capture the characteristics common to each class from the feature vectors of the training data. While Gaussian mixture models appear general enough to characterize the distribution of the given data, the model is constrained by the assumptions that the components of the distribution are Gaussian in shape and that the number of mixtures is fixed a priori. In this context, we investigate the potential of non-linear models such as autoassociative neural network (AANN) models, which perform identity mapping of the input space. We show that the training error surface realized by the neural network model in the feature space is useful for studying the characteristics of the distribution of the input data. We also propose a method of obtaining an error surface that matches the distribution of the given data. The distribution-capturing ability of AANN models is illustrated in the context of speaker verification.
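The abstract's central idea is an identity-mapping network whose training (reconstruction) error over the feature space reflects the distribution of the training data. The sketch below is a minimal illustration of that idea in plain NumPy, assuming a simple three-layer network with a tanh hidden layer and mean squared reconstruction error as the verification score; the layer sizes, learning rate, and toy data are illustrative assumptions and do not reproduce the paper's architecture or its annealing of the gain parameter.

```python
import numpy as np

# Minimal AANN sketch: a small network trained to reconstruct its own
# input (identity mapping). The reconstruction error of a test vector
# is then used as a score, e.g. to accept or reject a claimed speaker.
# All sizes and hyperparameters here are illustrative assumptions.

rng = np.random.default_rng(0)

def init_aann(dim_in, dim_hidden):
    """Initialize weights of a 3-layer identity-mapping network."""
    w1 = rng.normal(0.0, 0.1, (dim_in, dim_hidden))
    w2 = rng.normal(0.0, 0.1, (dim_hidden, dim_in))
    return w1, w2

def forward(x, w1, w2):
    """tanh hidden layer followed by a linear output layer."""
    h = np.tanh(x @ w1)
    return h, h @ w2

def train(features, dim_hidden=4, lr=0.01, epochs=500):
    """Train the AANN to map each feature vector onto itself."""
    w1, w2 = init_aann(features.shape[1], dim_hidden)
    for _ in range(epochs):
        h, y = forward(features, w1, w2)
        err = y - features                               # reconstruction error
        grad_w2 = h.T @ err
        grad_w1 = features.T @ ((err @ w2.T) * (1.0 - h ** 2))
        w2 -= lr * grad_w2 / len(features)
        w1 -= lr * grad_w1 / len(features)
    return w1, w2

def reconstruction_error(x, w1, w2):
    """Mean squared error between each input vector and its reconstruction."""
    _, y = forward(x, w1, w2)
    return np.mean((y - x) ** 2, axis=1)

# Toy usage: train on feature vectors of one "speaker", score test vectors.
train_feats = rng.normal(0.0, 1.0, (200, 8))
w1, w2 = train(train_feats)
test_feats = rng.normal(0.0, 1.0, (5, 8))
scores = reconstruction_error(test_feats, w1, w2)
print(scores)  # lower error suggests the vector lies closer to the captured distribution
```

In this toy setup the model captures the training distribution only implicitly, through where the reconstruction error is low; the paper goes further by analysing the shape of that error surface itself.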
| Item Type: | Article |
|---|---|
| Source: | Copyright of this article belongs to Elsevier Science. |
| Keywords: | Autoassociative Neural Network Models; Training Error Surface; Annealing Gain Parameter; Speaker Verification |
| ID Code: | 57743 |
| Deposited On: | 29 Aug 2011 11:52 |
| Last Modified: | 29 Aug 2011 11:52 |