
Deep Neural Architecture Search with Deep Graph Bayesian Optimization

2019-05-14
Lizheng Ma, Jiaxu Cui, Bo Yang

Abstract

Bayesian optimization (BO) is an effective method for finding the global optima of black-box functions. Recently, BO has been applied to neural architecture search and shows better performance than pure evolutionary strategies. These methods adopt Gaussian processes (GPs) as the surrogate function, with handcrafted similarity metrics as input. In this work, we propose a Bayesian graph neural network as a new surrogate, which can automatically extract features from deep neural architectures and use the learned features to fit and characterize black-box objectives and their uncertainty. Based on the new surrogate, we then develop a graph Bayesian optimization framework to address the challenging task of deep neural architecture search. Experimental results show our method significantly outperforms the comparative methods on benchmark tasks.
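The abstract describes the standard Bayesian optimization loop: fit a surrogate to the observations so far, use an acquisition function to pick the next candidate, evaluate it, and repeat. The sketch below illustrates that loop with a classic GP surrogate and an upper-confidence-bound acquisition over a toy one-dimensional objective; the paper's contribution is to replace the GP with a Bayesian graph neural network operating on architecture graphs, which is not reproduced here. The objective function, kernel length scale, and grid of candidates are all illustrative assumptions, not from the paper.

```python
import numpy as np

# Minimal Bayesian optimization sketch with a Gaussian-process surrogate.
# The paper swaps the GP for a Bayesian graph neural network, but the
# outer loop (fit surrogate -> maximize acquisition -> evaluate) is the same.

def objective(x):
    # Hypothetical black-box objective (stand-in for, e.g., the validation
    # accuracy of a candidate architecture); its maximum is at x = 2.
    return -(x - 2.0) ** 2 + 3.0

def rbf_kernel(a, b, length=1.0):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # Standard GP regression posterior: mean and variance at query points.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_query, x_train)
    alpha = np.linalg.solve(K, y_train)
    mu = Ks @ alpha
    v = np.linalg.solve(K, Ks.T)
    var = 1.0 - np.sum(Ks * v.T, axis=1)  # k(x,x) = 1 for the RBF kernel
    return mu, np.maximum(var, 0.0)

def bayes_opt(n_iters=20, beta=2.0, seed=0):
    rng = np.random.default_rng(seed)
    candidates = np.linspace(-5.0, 5.0, 200)      # discrete search space
    x_train = rng.uniform(-5.0, 5.0, size=3)      # initial random designs
    y_train = objective(x_train)
    for _ in range(n_iters):
        mu, var = gp_posterior(x_train, y_train, candidates)
        ucb = mu + beta * np.sqrt(var)            # upper confidence bound
        x_next = candidates[np.argmax(ucb)]       # most promising candidate
        x_train = np.append(x_train, x_next)
        y_train = np.append(y_train, objective(x_next))
    best = x_train[np.argmax(y_train)]
    return best, objective(best)

best_x, best_y = bayes_opt()
print(best_x, best_y)
```

With enough iterations, the sampled points concentrate around the optimum, since the UCB acquisition trades off the surrogate's predicted mean (exploitation) against its predictive uncertainty (exploration); this is the same explore/exploit mechanism that drives the architecture search in the paper.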

URL

https://arxiv.org/abs/1905.06159

PDF

https://arxiv.org/pdf/1905.06159
