
Large-Scale Spectrum Occupancy Learning via Tensor Decomposition and LSTM Networks

2019-05-10
Mohsen Joneidi, Ismail Alkhouri, Nazanin Rahnavard

Abstract

A new paradigm for large-scale spectrum occupancy learning based on long short-term memory (LSTM) recurrent neural networks is proposed. Studies have shown that spectrum usage is a highly correlated time series. Moreover, spectrum occupancy is correlated across different frequency channels. Therefore, capturing all of these correlations by learning and predicting one-dimensional time series is not a trivial task. In this paper, we introduce a new framework that represents the spectrum measurements in a tensor format. Next, a time-series prediction method based on CANDECOMP/PARAFAC (CP) tensor decomposition and LSTM recurrent neural networks is proposed. The proposed method is computationally efficient and is able to capture the different types of correlation within the measured spectrum. Moreover, it is robust against noise and missing entries in the sensed spectrum. The superiority of the proposed method is demonstrated on a large-scale synthetic dataset in terms of prediction accuracy and computational efficiency.
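The abstract outlines a pipeline of representing spectrum measurements as a tensor, factoring it with CP decomposition, and forecasting the low-dimensional temporal factor with an LSTM. The sketch below is a minimal, hypothetical illustration of such a pipeline (not the authors' code): it assumes a (channel × sensor × time) occupancy tensor and uses tensorly's `parafac` together with a PyTorch LSTM; the tensor dimensions, rank, and hyper-parameters are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch of a CP-decomposition + LSTM forecasting pipeline.
# All sizes (64 channels, 16 sensors, 500 time slots, rank 8) are assumptions.
import numpy as np
import torch
import torch.nn as nn
import tensorly as tl
from tensorly.decomposition import parafac

tl.set_backend('numpy')

# Synthetic occupancy tensor: channels x sensors x time slots.
X = np.random.rand(64, 16, 500)

# CP decomposition with an assumed rank of 8; factors[2] is the temporal
# factor matrix of shape (time, rank).
weights, factors = parafac(tl.tensor(X), rank=8)
temporal = torch.tensor(factors[2], dtype=torch.float32)  # (T, R)

class FactorLSTM(nn.Module):
    """One-step-ahead LSTM forecaster over the temporal CP factors."""
    def __init__(self, rank, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(rank, hidden, batch_first=True)
        self.out = nn.Linear(hidden, rank)

    def forward(self, seq):               # seq: (batch, steps, rank)
        h, _ = self.lstm(seq)
        return self.out(h[:, -1, :])      # next factor vector

model = FactorLSTM(rank=8)
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

window = 20
for epoch in range(5):                    # short illustrative training loop
    for t in range(window, temporal.shape[0] - 1):
        seq = temporal[t - window:t].unsqueeze(0)   # (1, window, rank)
        target = temporal[t].unsqueeze(0)           # (1, rank)
        optim.zero_grad()
        loss = loss_fn(model(seq), target)
        loss.backward()
        optim.step()

# A predicted temporal factor vector can then be combined with the fixed
# channel and sensor factor matrices to reconstruct the forecast spectrum slice.
```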

URL

http://arxiv.org/abs/1905.04392

PDF

http://arxiv.org/pdf/1905.04392

