
Gauged Neural Network: Phase Structure, Learning, and Associative Memory

2006-06-04
Motohiro Kemuriyama, Tetsuo Matsui, Kazuhiko Sakakibara

Abstract

A gauge model of neural networks is introduced which resembles the Z(2) Higgs lattice gauge theory of high-energy physics. It contains a neuron variable $S_x = \pm 1$ on each site $x$ of a 3D lattice and a synaptic-connection variable $J_{x\mu} = \pm 1$ on each link $(x, x+\hat{\mu})$ $(\mu = 1,2,3)$. The model is regarded as a generalization of the Hopfield model of associative memory to a model of learning, obtained by promoting the synaptic weight between $x$ and $x+\hat{\mu}$ to a dynamical Z(2) gauge variable $J_{x\mu}$. The local Z(2) gauge symmetry is inherited from the Hopfield model and ensures the locality of the time evolutions of $S_x$ and $J_{x\mu}$ as well as a generalized Hebbian learning rule. At finite "temperatures", numerical simulations show that the model exhibits the Higgs, confinement, and Coulomb phases. We simulate dynamical processes of learning a pattern of $S_x$ and recalling it, and classify the parameter space according to the performance. In some parameter regions, stable column-layer structures in signal propagation are generated spontaneously. Mutual interactions between $S_x$ and $J_{x\mu}$ induce partial memory loss, as expected.
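
The abstract does not spell out the energy function, but the Z(2) Higgs lattice gauge theory it refers to suggests the standard gauge-invariant form with a link term $S_x J_{x\mu} S_{x+\hat{\mu}}$ and a plaquette term built from $J$. The sketch below is a minimal finite-temperature Metropolis simulation assuming that form; the couplings `c1`, `c2`, the inverse temperature `beta`, and the reading of $J$-updates as "learning" and $S$-updates as "recall" are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 4                        # linear size of the periodic 3D lattice
c1, c2 = 1.0, 1.0            # hypothetical couplings of link (Higgs) and plaquette terms
beta = 0.5                   # inverse "temperature"

S = rng.choice([-1, 1], size=(L, L, L))       # neuron variables S_x on sites
J = rng.choice([-1, 1], size=(3, L, L, L))    # synaptic gauge variables J_{x,mu} on links

def shift(a, mu):
    """Translate a field by one lattice unit in direction mu (periodic boundaries)."""
    return np.roll(a, -1, axis=mu)

def energy(S, J):
    """Assumed gauge-invariant energy: -c1 * sum S_x J_{x,mu} S_{x+mu} - c2 * sum of J plaquettes."""
    e = 0.0
    for mu in range(3):
        e -= c1 * np.sum(S * J[mu] * shift(S, mu))
    for mu in range(3):
        for nu in range(mu + 1, 3):
            plaq = J[mu] * shift(J[nu], mu) * shift(J[mu], nu) * J[nu]
            e -= c2 * np.sum(plaq)
    return e

def metropolis_sweep(S, J):
    """One sweep of single-flip Metropolis updates for S ("recall") and J ("learning")."""
    # Full-energy recomputation is used for clarity; a real simulation would use
    # local energy differences.
    for field in (S, J):
        for _ in range(field.size):
            idx = tuple(rng.integers(0, s) for s in field.shape)
            e_old = energy(S, J)
            field[idx] *= -1                      # trial flip
            if rng.random() >= np.exp(-beta * (energy(S, J) - e_old)):
                field[idx] *= -1                  # reject and restore

for sweep in range(5):
    metropolis_sweep(S, J)
print("energy per site:", energy(S, J) / L**3)
```

Because both $S$ and $J$ are updated by local flips, the dynamics stays local in the sense the abstract emphasizes; scanning `beta`, `c1`, and `c2` is how one would map out the Higgs, confinement, and Coulomb regions numerically.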

URL

https://arxiv.org/abs/cond-mat/0203136

PDF

https://arxiv.org/pdf/cond-mat/0203136

