papers AI Learner

Maximum memory capacity on neural networks with short-term depression and facilitation

2008-09-11
Jorge F. Mejias, Joaquin J. Torres

Abstract

In this work we study, analytically and by means of Monte Carlo simulations, the influence of the competition between several activity-dependent synaptic processes, such as short-term synaptic facilitation and depression, on the maximum memory storage capacity of a neural network. In contrast with the case of synaptic depression, which drastically reduces the capacity of the network to store and retrieve "static" activity patterns, synaptic facilitation enhances the storage capacity in different contexts. In particular, we find optimal values of the relevant synaptic parameters (such as the neurotransmitter release probability or the characteristic facilitation time constant) for which the storage capacity can be maximal and similar to the one obtained with static synapses, that is, without activity-dependent processes. We conclude that depressing synapses with a certain level of facilitation allow one to recover the good retrieval properties of networks with static synapses while maintaining the nonlinear characteristics of dynamic synapses, which are convenient for information processing and coding.
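To make the setup concrete, below is a minimal Monte Carlo sketch of the kind of system the abstract describes: a Hopfield-type network of binary neurons with Hebbian weights, where each presynaptic efficacy is modulated by short-term depression and facilitation variables in the style of the Tsodyks-Markram phenomenological model. The specific discretization, the variable names (`x`, `u`), and the parameter values (`U`, `tau_rec`, `tau_fac`, network size, noise level) are illustrative assumptions for this sketch, not taken verbatim from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not the values used in the paper)
N = 200          # number of binary neurons
P = 10           # number of stored random patterns
T = 0.05         # noise level for the stochastic updates

# Dynamic-synapse parameters (assumed Tsodyks-Markram-style phenomenology)
U = 0.2          # baseline neurotransmitter release probability
tau_rec = 3.0    # recovery (depression) time constant, in update steps
tau_fac = 5.0    # facilitation time constant, in update steps

# Store P random unbiased patterns with a standard Hebbian (covariance) rule
xi = rng.choice([0, 1], size=(P, N))
W = ((xi - 0.5).T @ (xi - 0.5)) * (4.0 / N)
np.fill_diagonal(W, 0.0)

# State: activity s, available resources x, release probability u
s = xi[0].astype(float).copy()      # start near the first stored pattern
x = np.ones(N)
u = np.full(N, U)

def overlap(state, pattern):
    """Overlap between the network state and a stored pattern (1 = perfect recall)."""
    return (2.0 / N) * np.sum((2 * state - 1) * (pattern - 0.5))

for step in range(200):
    # Effective synaptic strength is modulated presynaptically by x_j * u_j / U
    h = W @ (s * x * u / U)
    # Stochastic (Glauber-like) parallel update of the binary neurons
    prob_on = 1.0 / (1.0 + np.exp(-h / T))
    s = (rng.random(N) < prob_on).astype(float)
    # One possible discrete-time update of depression and facilitation variables
    x = x + (1.0 - x) / tau_rec - u * x * s
    u = u + (U - u) / tau_fac + U * (1.0 - u) * s

print("overlap with pattern 0:", overlap(s, xi[0]))
```

With static synapses one would simply set `x = 1` and `u = U` throughout; measuring how the maximal number of retrievable patterns changes as `U` and `tau_fac` are varied is the kind of storage-capacity question the paper addresses analytically and with simulations of this sort.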

URL

https://arxiv.org/abs/0809.2010

PDF

https://arxiv.org/pdf/0809.2010

