Abstract: Predictive resource allocation can harness residual resources in wireless networks to serve non-real-time services, where the prediction of residual resources plays a key role and can be converted into the problem of predicting the traffic load of real-time services. In this paper, a neural network architecture recently proposed for natural language processing, which is based entirely on the attention mechanism, is introduced to time-series prediction, in particular traffic load prediction. By training and testing on a real traffic load dataset measured every second, the multi-step prediction method with the all-attention mechanism is compared with methods using recurrent neural networks (RNNs) as well as linear and non-linear predictors, in terms of complexity (measured by training and testing time), prediction accuracy (measured by mean relative percentage error), and prediction error statistics (measured by the mean and standard deviation of the prediction error). Simulation results show that the primary advantage of the designed all-attention neural network over an RNN without attention lies in its low training and testing complexity; due to the accumulation of errors in multi-step prediction, its gain in prediction accuracy is not obvious.
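To make the two ingredients named in the abstract concrete, the following is a minimal NumPy sketch of (a) the scaled dot-product attention operation that the all-attention architecture is built from, and (b) the mean relative percentage error (MRPE) metric used to measure prediction accuracy. The function names and the exact MRPE formula are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q: (n_q, d_k) queries; K: (n_k, d_k) keys; V: (n_k, d_v) values.
    Returns (n_q, d_v): each output row is a weighted average of the
    value rows, weighted by query-key similarity.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # row-wise softmax
    return weights @ V

def mrpe(actual, predicted):
    """Mean relative percentage error (assumed form): mean of |a - p| / |a|, in percent."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs(actual - predicted) / np.abs(actual))
```

For multi-step prediction, each forecast step would reuse earlier predicted values as inputs, which is why errors accumulate over the prediction horizon as the abstract notes.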