
Efficient Neural Task Adaptation by Maximum Entropy Initialization

2019-05-25
Farshid Varno, Behrouz Haji Soleimani, Marzie Saghayi, Lisa Di Jorio, Stan Matwin

Abstract

Transferring knowledge from one neural network to another has been shown to help with learning tasks that have few training examples. Prevailing fine-tuning methods, however, can contaminate the pre-trained features with random noise of comparably high energy. This noise is mainly introduced by the careless replacement of task-specific parameters. We theoretically analyze such knowledge contamination for classification tasks and propose a practical, easy-to-apply method to trap and minimize the contaminant. In our approach, the entropy of the output estimates is maximized at initialization and the first back-propagated error is stalled at the output of the last layer. Our proposed method not only outperforms traditional fine-tuning but also significantly speeds up the convergence of the learner. It is robust to randomness and independent of the choice of architecture. Overall, our experiments show that the power of transfer learning has been substantially underestimated so far.
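A minimal sketch of one way to realize this idea, assuming (as the abstract suggests) that maximum-entropy initialization amounts to zero-initializing the newly added task-specific head: with all-zero weights and biases the logits are zero, so the softmax output is uniform (maximum entropy), and the gradient flowing back through the zero weight matrix into the pre-trained features is exactly zero on the first backward pass. The class name `MaxEntHead` and the dimensions below are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class MaxEntHead(nn.Module):
    """Hypothetical task-specific head, zero-initialized so the softmax
    output is uniform (maximum entropy) before any training step."""

    def __init__(self, in_features: int, num_classes: int):
        super().__init__()
        self.fc = nn.Linear(in_features, num_classes)
        nn.init.zeros_(self.fc.weight)  # all-zero logits -> uniform softmax
        nn.init.zeros_(self.fc.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc(x)

# With W = 0, the gradient w.r.t. the backbone features, W^T (p - y),
# is zero on the first backward pass, so no noisy error signal reaches
# the pre-trained layers until the head itself moves away from zero.
head = MaxEntHead(in_features=512, num_classes=10)
probs = torch.softmax(head(torch.randn(1, 512)), dim=1)
print(probs)  # uniform: every entry equals 1/10
```

Whether the paper uses exactly this zero-initialization or another maximum-entropy scheme, the key property shown here is the same: uniform initial outputs and a first error signal that is trapped at the last layer.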

URL

http://arxiv.org/abs/1905.10698

PDF

http://arxiv.org/pdf/1905.10698
