Tractable Theories for the Synthesis of Neural Networks

Chandru, V.; Vidyasagar, M.; Vinay, V. (1997) Tractable Theories for the Synthesis of Neural Networks. Interfaces in Computer Science and Operations Research, 7. pp. 97-121. ISSN 1387-666X

Full text not available from this repository.

Official URL: http://doi.org/10.1007/978-1-4615-4102-8_3

Abstract

The radius of direct attraction of a discrete neural network is a measure of the stability of the network. It is known that Hopfield networks designed using Hebb’s Rule have a radius of direct attraction of Ω(n/p), where n is the size of the input patterns and p is the number of them. This lower bound is tight if p is no larger than 4. We construct a family of such networks with radius of direct attraction Θ(n/p) for any p ≥ 5. The techniques used to prove this result led us to the first polynomial-time algorithm for designing a neural network with maximum radius of direct attraction around arbitrary input patterns. The optimal synaptic matrix is computed using the ellipsoid method of linear programming in conjunction with an efficient separation oracle. Restrictions of symmetry and non-negative diagonal entries in the synaptic matrix can be accommodated within this scheme. We also present a stability theory for generalized analog neural networks whose energy functions are multilinear polynomials. The main conclusions are that (i) the network is totally stable, and (ii) “almost all” trajectories of the network converge to a local minimum of the energy function. This is the largest class of energy functions for which such sharp convergence properties are known.
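
As an illustration of the Hebb’s-Rule construction mentioned in the abstract, the following sketch (ours, not the paper’s) builds the outer-product synaptic matrix for ±1 patterns and estimates the radius of direct attraction by sampling perturbed states, taking “directly attracted” to mean that a single synchronous sign update maps the perturbed state back onto the stored pattern. The function names and the sampling procedure are assumptions of this sketch.

    import numpy as np

    def hebb_rule(patterns):
        # Outer-product (Hebb) rule for +/-1 patterns; zero diagonal.
        W = patterns.T @ patterns
        np.fill_diagonal(W, 0)
        return W

    def directly_attracted(W, x, y):
        # One synchronous sign update; ties broken toward +1.
        return np.array_equal(np.where(W @ y >= 0, 1, -1), x)

    def empirical_radius(W, x, trials=500, seed=0):
        # Largest r such that every *sampled* state at Hamming distance r
        # from x returns to x in one step (an estimate, not the worst case).
        rng = np.random.default_rng(seed)
        n = len(x)
        for r in range(1, n):
            for _ in range(trials):
                y = x.copy()
                y[rng.choice(n, size=r, replace=False)] *= -1
                if not directly_attracted(W, x, y):
                    return r - 1
        return n - 1

    # p random patterns of size n; Hebb's Rule gives radius on the order of n/p.
    rng = np.random.default_rng(1)
    n, p = 200, 5
    patterns = rng.choice([-1, 1], size=(p, n))
    W = hebb_rule(patterns)
    print([empirical_radius(W, x) for x in patterns])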

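The key computational point behind the synthesis result is that demanding a radius of direct attraction of at least r places exponentially many linear constraints on the weights (one per state within Hamming distance r of a stored pattern), yet a most-violated constraint can be found in polynomial time, which is all the ellipsoid method requires. The sketch below shows one such oracle under the same ±1 conventions as above; it rests on the observation that, for neuron i and pattern x, the worst admissible perturbation flips the at most r coordinates with the largest positive contributions c_j = x_i·W[i,j]·x_j. This is our reading of the abstract, not the paper’s actual oracle.

    import numpy as np

    def worst_case_margins(W, patterns, r):
        # margins[k, i] = min over states y within Hamming distance r of
        # pattern x^k of x_i * (W y)_i. Flipping coordinate j subtracts
        # 2 * c_j with c_j = x_i * W[i, j] * x_j, so the adversary flips
        # the r largest positive c_j -- a polynomial-time check of an
        # exponential constraint family. Assumes 1 <= r < n.
        p, n = patterns.shape
        margins = np.empty((p, n))
        for k, x in enumerate(patterns):
            C = W * np.outer(x, x)          # C[i, j] = x_i * W[i, j] * x_j
            base = C.sum(axis=1)            # x_i * (W x)_i for each neuron i
            loss = np.sort(np.clip(C, 0, None), axis=1)[:, -r:].sum(axis=1)
            margins[k] = base - 2 * loss
        return margins

    def violated_constraint(W, patterns, r):
        # Return (pattern index, neuron index) of a violated constraint,
        # or None if every state within distance r is directly attracted.
        m = worst_case_margins(W, patterns, r)
        bad = np.argwhere(m <= 0)
        return tuple(bad[0]) if len(bad) else None

Combining such an oracle with an ellipsoid-method linear-programming solver, and searching over r, gives the shape of the polynomial-time design algorithm described in the abstract; the symmetry and non-negative-diagonal restrictions are themselves linear constraints, so they fit the same scheme.
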
Item Type: Article
Source: Copyright of this article belongs to Springer Nature Switzerland AG
Keywords: Neural Network; Weight Matrix; Input Pattern; Direct Attraction; Linear Programming Formulation
ID Code: 132606
Deposited On: 20 Dec 2022 08:09
Last Modified: 20 Dec 2022 08:09
