
Uncertainty Propagation in Deep Neural Network Using Active Subspace

2019-03-10
Weiqi Ji, Zhuyin Ren, Chung K. Law

Abstract

The inputs of a deep neural network (DNN) drawn from real-world data usually come with uncertainties. Yet it is challenging to propagate the uncertainty in the input features to the DNN predictions at low computational cost. This work employs a gradient-based subspace method and a response surface technique to accelerate uncertainty propagation in DNNs. Specifically, the active subspace method is employed to identify the most important subspace in the input features using the gradient of the DNN output with respect to the inputs. A response surface within that low-dimensional subspace can then be built efficiently, and the uncertainty of the prediction can be obtained by evaluating the computationally cheap response surface instead of the DNN model. In addition, the subspace can help explain adversarial examples. The approach is demonstrated on the MNIST dataset with a convolutional neural network.
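The pipeline the abstract describes can be sketched in a few steps: sample inputs from their uncertainty distribution, average the outer products of the output gradients to form a covariance matrix, take its dominant eigenvectors as the active subspace, fit a cheap response surface in the active variables, and evaluate that surface to propagate uncertainty. The sketch below is a minimal illustration with NumPy, not the authors' code: a simple analytic function `f(x) = sin(a @ x)` with a known gradient stands in for the DNN, and a polynomial stands in for the response surface.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10          # input dimension (hypothetical)
n = 200         # number of Monte Carlo gradient samples

# Stand-in for the DNN: f(x) = sin(a @ x), which varies only along
# the direction a, so its active subspace is one-dimensional.
a = rng.normal(size=d)
a /= np.linalg.norm(a)

def grad_f(x):
    # Closed-form gradient; for a real DNN this would come from
    # backpropagation of the output with respect to the inputs.
    return np.cos(a @ x) * a

# 1. Sample inputs from their (assumed) uncertainty distribution.
X = rng.normal(size=(n, d))

# 2. Form the gradient covariance matrix C = E[grad f grad f^T].
G = np.array([grad_f(x) for x in X])
C = G.T @ G / n

# 3. Eigendecomposition; a large gap in the spectrum marks the
#    active subspace spanned by the leading eigenvectors.
eigvals, eigvecs = np.linalg.eigh(C)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # descending

W1 = eigvecs[:, :1]            # active directions (here 1-D)
y = X @ W1                     # low-dimensional active variables

# 4. Fit a cheap response surface g(y) ~ f(x) in the active variable.
f_train = np.sin(X @ a)
coeffs = np.polyfit(y[:, 0], f_train, deg=7)

# 5. Propagate input uncertainty through the response surface
#    instead of the full model.
X_new = rng.normal(size=(5000, d))
f_approx = np.polyval(coeffs, (X_new @ W1)[:, 0])
```

The statistics of `f_approx` (mean, variance, quantiles) then approximate those of the full model's output at a fraction of the cost, since each surrogate evaluation is a low-degree polynomial rather than a forward pass.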

URL

https://arxiv.org/abs/1903.03989

PDF

https://arxiv.org/pdf/1903.03989
