
Associative Memory by Recurrent Neural Networks with Delay Elements

2002-09-12
Seiji Miyoshi, Hiro-Fumi Yanai, Masato Okada

Abstract

The synapses of real neural systems seem to have delays, so it is worthwhile to analyze associative memory models with delayed synapses. We therefore discuss a sequential associative memory model with delayed synapses, employing a discrete synchronous updating rule and a correlation learning rule, and analyze its dynamic properties by statistical neurodynamics. In this paper, we first re-derive the Yanai-Kim theory, which gives the macrodynamical equations for the dynamics of a network with serial delay elements. Since their theory requires a computational complexity of $O(L^4 t)$ to obtain the macroscopic state at time step $t$, where $L$ is the length of delay, it is intractable to discuss the macroscopic properties in the large $L$ limit. We therefore derive steady-state equations using the discrete Fourier transformation, whose computational complexity does not formally depend on $L$. We show that the storage capacity $\alpha_C$ is proportional to the delay length $L$ in the large $L$ limit, with proportionality constant 0.195, i.e., $\alpha_C = 0.195 L$. These results are supported by computer simulations.
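To make the model concrete, below is a minimal simulation sketch of one plausible reading of the network described in the abstract: bipolar patterns stored as a cyclic sequence by a correlation (Hebb-like) rule, with one weight matrix per serial delay element and a discrete synchronous sign update. The sizes `N`, `L`, `M` and the exact form of the learning rule are illustrative assumptions, not the paper's precise formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500   # number of neurons (illustrative size)
L = 3     # number of serial delay elements (illustrative)
M = 50    # patterns in the cyclic sequence, loading rate alpha = M/N = 0.1

# Random bipolar patterns xi[mu] forming a cyclic sequence.
xi = rng.choice([-1, 1], size=(M, N))

# Correlation learning (assumed form): the weight matrix for delay l
# associates the pattern seen l steps in the past with the *next*
# pattern, J_l = (1/N) sum_mu xi^{mu+l+1} (xi^{mu})^T, indices mod M.
J = np.zeros((L, N, N))
for l in range(L):
    for mu in range(M):
        J[l] += np.outer(xi[(mu + l + 1) % M], xi[mu])
J /= N

def step(history):
    """Discrete synchronous update: sum contributions from all L
    delayed states, then apply the sign activation."""
    h = sum(J[l] @ history[l] for l in range(L))
    return np.sign(h + 1e-12)  # tiny offset breaks sgn(0) ties

# Fill the delay line with consecutive patterns, then recall forward.
# history[l] holds the state l time steps in the past (history[0] newest).
history = [xi[(L - 1 - l) % M].copy() for l in range(L)]
for t in range(10):
    new = step(history)
    history = [new] + history[:-1]
    # Overlap with the pattern the sequence should produce next.
    overlap = new @ xi[(L + t) % M] / N
    print(f"t={t}: overlap with expected pattern = {overlap:.3f}")
```

At this loading rate the overlaps stay near 1, i.e., the network walks through the stored sequence; pushing `M` up until recall collapses gives a rough numerical handle on the storage capacity whose large-$L$ scaling ($\alpha_C = 0.195 L$) the paper derives analytically.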

URL

https://arxiv.org/abs/cond-mat/0209258

PDF

https://arxiv.org/pdf/cond-mat/0209258
