
Dense Morphological Network: An Universal Function Approximator

2019-01-01
Ranjan Mondal, Sanchayan Santra, Bhabatosh Chanda

Abstract

Artificial neural networks are built on the basic operations of linear combination and non-linear activation. Theoretically this structure can approximate any continuous function with a three-layer architecture, but in practice learning the parameters of such a network can be hard. The choice of activation function can also greatly impact the performance of the network. In this paper we propose to replace the basic linear combination operation with non-linear operations that do away with the need for an additional non-linear activation function. To this end we propose the use of elementary morphological operations (dilation and erosion) as the basic operation in neurons. We show that these networks (denoted DenMo-Net) with morphological operations can approximate any smooth function, requiring fewer parameters than normal neural networks. The results show that our network performs favorably when compared with similarly structured networks.
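To make the idea concrete, below is a minimal NumPy sketch of dilation and erosion layers as the abstract describes them: each output is a max (or min) over inputs offset by learned weights, so the layer is non-linear by itself and needs no separate activation. This is an illustration under assumed conventions (one weight vector per output neuron, dilation as max of sums, erosion as min of differences); the paper's exact formulation, including bias handling and training details, may differ.

```python
import numpy as np

class DilationLayer:
    """Dilation neurons: out[b, j] = max_i (x[b, i] + W[j, i]).
    Sketch only; signs/bias handling in the paper may differ."""
    def __init__(self, in_features, out_features, rng=None):
        rng = rng or np.random.default_rng()
        self.W = rng.normal(size=(out_features, in_features))

    def forward(self, x):
        # x: (batch, in_features) -> (batch, out_features)
        return (x[:, None, :] + self.W[None, :, :]).max(axis=2)


class ErosionLayer:
    """Erosion neurons: out[b, j] = min_i (x[b, i] - W[j, i])."""
    def __init__(self, in_features, out_features, rng=None):
        rng = rng or np.random.default_rng()
        self.W = rng.normal(size=(out_features, in_features))

    def forward(self, x):
        return (x[:, None, :] - self.W[None, :, :]).min(axis=2)


if __name__ == "__main__":
    x = np.random.randn(4, 8)             # batch of 4, 8 input features
    h = DilationLayer(8, 16).forward(x)   # max/min are already non-linear,
    y = ErosionLayer(16, 3).forward(h)    # so no activation function is applied
    print(y.shape)                        # (4, 3)
```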

URL

https://arxiv.org/abs/1901.00109

PDF

https://arxiv.org/pdf/1901.00109

