
Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization

2019-04-03
Nicolas Y. Masse, Gregory D. Grant, David J. Freedman

Abstract

Humans and most animals can learn new tasks without forgetting old ones. However, training artificial neural networks (ANNs) on new tasks typically causes them to forget previously learned tasks. This phenomenon is the result of "catastrophic forgetting", in which training an ANN disrupts connection weights that were important for solving previous tasks, degrading task performance. Several recent studies have proposed methods to stabilize connection weights of ANNs that are deemed most important for solving a task, which helps alleviate catastrophic forgetting. Here, drawing inspiration from algorithms that are believed to be implemented in vivo, we propose a complementary method: adding a context-dependent gating signal, such that only sparse, mostly non-overlapping patterns of units are active for any one task. This method is easy to implement, requires little computational overhead, and allows ANNs to maintain high performance across large numbers of sequentially presented tasks when combined with weight stabilization. This work provides another example of how neuroscience-inspired algorithms can benefit ANN design and capability.
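The core mechanism described in the abstract is simple enough to sketch directly. Below is a minimal NumPy illustration of context-dependent gating: each task gets a fixed random binary mask over hidden units, so different tasks use sparse, mostly non-overlapping sub-networks. The layer sizes, keep-fraction, and function names here are illustrative assumptions; the abstract only specifies that the per-task activity patterns are sparse and mostly non-overlapping, and the paper combines this gating with a separate weight-stabilization method (not shown).

```python
import numpy as np

rng = np.random.default_rng(0)

N_TASKS = 20     # number of sequentially learned tasks (illustrative)
HIDDEN = 256     # hidden-layer width (illustrative)
KEEP_FRAC = 0.2  # fraction of units gated "on" per task (assumption;
                 # the abstract says only that activity is sparse)

# One fixed random binary gate per task. Gates are sampled independently,
# so any two tasks share roughly KEEP_FRAC**2 of the hidden units.
gates = (rng.random((N_TASKS, HIDDEN)) < KEEP_FRAC).astype(np.float32)

def gated_forward(x, W, b, task_id):
    """Hidden-layer forward pass with context-dependent gating:
    units not assigned to `task_id` are silenced, so training on this
    task leaves the rest of the network's weights largely undisturbed."""
    h = np.maximum(0.0, x @ W + b)  # ReLU hidden activity
    return h * gates[task_id]       # apply the task-specific gate

# Example: the same input produces mostly non-overlapping activity
# patterns under two different task contexts.
x = rng.standard_normal(64).astype(np.float32)
W = (0.1 * rng.standard_normal((64, HIDDEN))).astype(np.float32)
b = np.zeros(HIDDEN, dtype=np.float32)

h0 = gated_forward(x, W, b, task_id=0)
h1 = gated_forward(x, W, b, task_id=1)
overlap = np.mean((h0 > 0) & (h1 > 0))
print(f"fraction of units active in both tasks: {overlap:.3f}")
```

Because the gates are fixed per task rather than learned, the method adds almost no computational overhead, which is the property the abstract emphasizes.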

URL

http://arxiv.org/abs/1802.01569

PDF

http://arxiv.org/pdf/1802.01569

