
Learning Backpropagation-Free Deep Architectures with Kernels

2019-01-29
Shiyu Duan, Shujian Yu, Yunmei Chen, Jose Principe

Abstract

One can substitute each neuron in any neural network with a kernel machine and obtain a counterpart powered by kernel machines. The new network inherits the expressive power and architecture of the original but is more interpretable, since each node admits a simple interpretation as a hyperplane (in a reproducing kernel Hilbert space). Further, using the kernel multilayer perceptron as an example, we prove that in classification and under certain losses, an optimal representation that minimizes the risk of the network can be characterized for each hidden layer. This result removes the need for backpropagation in learning the model and generalizes to any feedforward kernel network. Moreover, unlike backpropagation, which turns models into black boxes, the optimal hidden representation enjoys an intuitive geometric interpretation, making the dynamics of learning in a deep kernel network transparent. Empirical results are provided to complement our theory.
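
The layer-wise recipe the abstract alludes to can be sketched in code. The toy below is an illustration under assumptions, not the authors' procedure: each hidden unit is a kernel machine (a hyperplane in an RKHS) fit directly to label-derived targets, which serves as a crude stand-in for the optimal hidden representation the paper characterizes, and the output layer is then trained on the frozen hidden outputs, so no gradients ever flow between layers. The choice of scikit-learn's `KernelRidge` and `SVC`, the random subsampling, and the per-unit bandwidths are all assumptions made for the sketch.

```python
# Minimal sketch of a backprop-free, two-layer "kernel MLP" trained greedily,
# layer by layer. Illustrative only; not the paper's exact algorithm.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

# Hidden layer: each unit is an independent kernel ridge machine fit to
# {-1, +1} label targets (a rough proxy for the optimal representation,
# which pulls same-class points together in the RKHS). Random subsamples
# and bandwidths keep the units from being identical.
n_hidden = 8
hidden_units = []
for _ in range(n_hidden):
    idx = rng.choice(len(X_tr), size=len(X_tr) // 2, replace=False)
    unit = KernelRidge(kernel="rbf", gamma=rng.uniform(0.5, 2.0), alpha=1e-2)
    unit.fit(X_tr[idx], 2.0 * y_tr[idx] - 1.0)  # targets in {-1, +1}
    hidden_units.append(unit)

def hidden_rep(X):
    # Stack the scalar outputs of the hidden kernel machines into features.
    return np.column_stack([u.predict(X) for u in hidden_units])

# Output layer: a kernel SVM trained on the frozen hidden representation.
clf = SVC(kernel="rbf", gamma="scale", C=1.0)
clf.fit(hidden_rep(X_tr), y_tr)
print("test accuracy:", clf.score(hidden_rep(X_te), y_te))
```

Each stage is a convex kernel problem solved in closed form or by standard solvers, which is what makes the learning dynamics transparent compared with end-to-end backpropagation.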

URL

http://arxiv.org/abs/1802.03774

PDF

http://arxiv.org/pdf/1802.03774

