
Deep Learning using Rectified Linear Units (ReLU)

2019-02-07
Abien Fred Agarap

Abstract

We introduce the use of rectified linear units (ReLU) as the classification function in a deep neural network (DNN). Conventionally, ReLU is used as an activation function in DNNs, with the Softmax function as their classification function. However, there have been several studies on using classification functions other than Softmax, and this study is an addition to those. We accomplish this by taking the activation of the penultimate layer h_{n-1} in a neural network, multiplying it by weight parameters θ to get the raw scores o_i, and then thresholding the raw scores at 0, i.e. f(o_i) = max(0, o_i), where f is the ReLU function. We provide class predictions ŷ through the argmax function, i.e. ŷ = argmax_i f(o_i).
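The following is a minimal NumPy sketch of the prediction step described in the abstract: raw scores from the penultimate activation, a ReLU threshold, then argmax. The function name, array shapes, and sample data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def relu_classify(h, theta):
    """Predict class labels from penultimate-layer activations.

    h:     (batch, d) activations of the penultimate layer h_{n-1}
    theta: (d, k)     weight parameters mapping to k raw class scores
    (shapes are assumptions for this sketch)
    """
    o = h @ theta                  # raw scores o_i = h_{n-1} · θ
    f = np.maximum(0.0, o)         # ReLU threshold: f(o_i) = max(0, o_i)
    return np.argmax(f, axis=1)    # class predictions ŷ = argmax_i f(o_i)

# Hypothetical usage with random data:
rng = np.random.default_rng(0)
h = rng.standard_normal((4, 8))      # 4 samples, 8 penultimate units
theta = rng.standard_normal((8, 3))  # 3 classes
print(relu_classify(h, theta))       # prints 4 predicted class indices
```

Note that since ReLU zeroes out all negative raw scores, ties at 0 are possible; argmax then simply returns the first such index.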

URL

http://arxiv.org/abs/1803.08375

PDF

http://arxiv.org/pdf/1803.08375

