
Uncertainty Estimations by Softplus normalization in Bayesian Convolutional Neural Networks with Variational Inference

2019-05-14
Kumar Shridhar, Felix Laumann, Marcus Liwicki

Abstract

We introduce a novel uncertainty estimation for classification tasks for Bayesian convolutional neural networks with variational inference. By normalizing the output of a Softplus function in the final layer, we estimate aleatoric and epistemic uncertainty in a coherent manner. The intractable posterior probability distributions over weights are inferred by Bayes by Backprop. Firstly, we demonstrate how this reliable variational inference method can serve as a fundamental construct for various network architectures. On multiple datasets in supervised learning settings (MNIST, CIFAR-10, CIFAR-100), this variational inference method achieves performance equivalent to frequentist inference in identical architectures, while the two desiderata, a measure for uncertainty and regularization, are incorporated naturally. Secondly, we examine how our proposed measure for aleatoric and epistemic uncertainties is derived and validate it on the aforementioned datasets.
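The core idea of the abstract can be sketched in a few lines: run several stochastic forward passes with weights sampled from the variational posterior, normalize the Softplus output of the final layer into class probabilities, and split the predictive covariance into an aleatoric and an epistemic term. The sketch below is a minimal NumPy illustration under that reading of the abstract, not the authors' implementation; the function name and the exact decomposition used here are assumptions.

```python
import numpy as np

def softplus(x):
    """Softplus activation, log(1 + exp(x))."""
    return np.log1p(np.exp(x))

def predictive_uncertainty(logits):
    """Sketch of Softplus-normalized uncertainty estimation.

    logits: array of shape (T, C) -- final-layer outputs for one input
    from T stochastic forward passes (weights drawn from the
    variational posterior), C classes.
    Returns the mean predictive distribution and two C x C matrices
    interpreted as aleatoric and epistemic uncertainty (an assumed
    covariance-style decomposition, not taken verbatim from the paper).
    """
    p = softplus(logits)
    p = p / p.sum(axis=1, keepdims=True)     # Softplus normalization
    p_bar = p.mean(axis=0)                   # predictive mean

    # Aleatoric part: average per-sample covariance diag(p_t) - p_t p_t^T
    aleatoric = np.mean([np.diag(pt) - np.outer(pt, pt) for pt in p], axis=0)
    # Epistemic part: spread of the sampled probability vectors around p_bar
    epistemic = np.mean([np.outer(pt - p_bar, pt - p_bar) for pt in p], axis=0)
    return p_bar, aleatoric, epistemic
```

With T Monte Carlo samples the epistemic term shrinks as the posterior concentrates, while the aleatoric term reflects irreducible class overlap in the data.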

URL

http://arxiv.org/abs/1806.05978

PDF

http://arxiv.org/pdf/1806.05978
