Global Boltzmann perceptron network for online learning of conditional distributions

Thathachar, M. A. L.; Arvind, M. T. (1999). Global Boltzmann perceptron network for online learning of conditional distributions. IEEE Transactions on Neural Networks, 10(5), pp. 1090-1098. ISSN 1045-9227.

Full text not available from this repository.

Official URL: http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arn...

Related URL: http://dx.doi.org/10.1109/TSMC.1975.5408434

Abstract

This paper proposes a backpropagation-based feedforward neural network for learning probability distributions of outputs conditioned on inputs, using only incoming input-output samples. The backpropagation procedure is shown to locally minimize the Kullback-Leibler measure in an expected sense. The procedure is enhanced to ensure boundedness of the weights and to promote exploration of the search space toward a global minimum. Weak convergence theory is employed to show that the long-term behavior of the resulting algorithm can be approximated by that of a stochastic differential equation whose invariant distributions are concentrated around the global minima of the Kullback-Leibler measure within a region of interest. Simulation studies on problems involving samples arriving from a mixture of labeled densities, as well as the well-known Iris data problem, demonstrate the speed and accuracy of the proposed procedure.
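The abstract's core recipe can be illustrated with a minimal sketch. The code below is a hypothetical, simplified stand-in for the paper's method (whose exact architecture and update rules are not reproduced in this record): a single-layer softmax network trained online by stochastic gradient descent on the per-sample cross-entropy (whose expectation differs from the Kullback-Leibler measure only by the target entropy), with weight clipping for boundedness and decaying Gaussian perturbations standing in for the global-exploration enhancement. All class names, hyperparameters, and the toy mixture-of-Gaussians stream are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()            # numerical stability
    e = np.exp(z)
    return e / e.sum()

class OnlineSoftmaxNet:
    """Hypothetical sketch: online conditional-distribution learner."""

    def __init__(self, n_in, n_out, lr=0.1, bound=5.0, noise=0.1):
        self.W = rng.normal(scale=0.1, size=(n_out, n_in))
        self.lr, self.bound, self.noise = lr, bound, noise
        self.t = 0

    def predict_proba(self, x):
        # Models P(y | x) with a Boltzmann/softmax output layer.
        return softmax(self.W @ x)

    def update(self, x, y):
        self.t += 1
        p = self.predict_proba(x)
        target = np.zeros_like(p)
        target[y] = 1.0
        grad = np.outer(p - target, x)        # grad of per-sample cross-entropy
        sigma = self.noise / np.sqrt(self.t)  # decaying exploration noise (assumed schedule)
        self.W -= self.lr * grad
        self.W += sigma * rng.normal(size=self.W.shape)
        np.clip(self.W, -self.bound, self.bound, out=self.W)  # boundedness of weights

# Streaming samples from a mixture of two labeled Gaussian densities,
# echoing the paper's mixture-of-labeled-densities experiments:
model = OnlineSoftmaxNet(n_in=2, n_out=2)
for _ in range(2000):
    y = int(rng.integers(2))
    x = rng.normal(loc=(-1.0, 1.0)[y], scale=0.5, size=2)
    model.update(x, y)
```

After the stream is consumed, `model.predict_proba` returns an estimated conditional distribution over labels for a new input; the injected noise plays the role the paper assigns to stochastic exploration for escaping poor local minima.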

Item Type: Article
Source: Copyright of this article belongs to IEEE.
ID Code: 51323
Deposited On: 28 Jul 2011 15:01
Last Modified: 28 Jul 2011 15:01