
Backprop as Functor: A compositional perspective on supervised learning

2019-05-01
Brendan Fong, David I. Spivak, Rémy Tuyéras

Abstract

A supervised learning algorithm searches over a set of functions A → B parametrised by a space P to find the best approximation to some ideal function f : A → B. It does this by taking examples (a, f(a)) ∈ A × B, and updating the parameter according to some rule. We define a category where these update rules may be composed, and show that gradient descent—with respect to a fixed step size and an error function satisfying a certain property—defines a monoidal functor from a category of parametrised functions to this category of update rules. This provides a structural perspective on backpropagation, as well as a broad generalisation of neural networks.

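To make the abstract's setup concrete, here is a minimal sketch in Python of a "learner": a parametrised function I : P × A → B paired with an update rule U : P × A × B → P, with gradient descent at a fixed step size `eps` under squared error playing the role of the update. All names (`Learner`, `implement`, `update`, `eps`) are illustrative, not the paper's notation, and the one-parameter family f_p(a) = p·a is an assumed toy example.

```python
# Sketch of a learner: a parametrised function plus an update rule.
# Gradient descent with fixed step size eps and squared error
# e(b, b') = (b - b')^2 / 2 supplies the update, as in the paper's functor.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Learner:
    param: float
    implement: Callable[[float, float], float]      # I : P x A -> B
    update: Callable[[float, float, float], float]  # U : P x A x B -> P

def gradient_descent_learner(eps: float, p0: float) -> Learner:
    """Learner for the toy family f_p(a) = p * a under squared error."""
    def implement(p: float, a: float) -> float:
        return p * a

    def update(p: float, a: float, b: float) -> float:
        # One gradient step: d/dp [(p*a - b)^2 / 2] = (p*a - b) * a
        return p - eps * (implement(p, a) - b) * a

    return Learner(param=p0, implement=implement, update=update)

# Each training example (a, f(a)) nudges the parameter toward the ideal f.
L = gradient_descent_learner(eps=0.1, p0=0.0)
for _ in range(50):
    L.param = L.update(L.param, 2.0, 6.0)  # ideal f(a) = 3a, so f(2) = 6
print(round(L.param, 3))  # ~3.0
```

The paper's contribution is that such learners compose: plugging one learner's output into another's input yields a learner for the composite, and under gradient descent this composition recovers backpropagation.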
URL

http://arxiv.org/abs/1711.10455

PDF

http://arxiv.org/pdf/1711.10455

