Abstract
Predicting future values and recovering missing data are two vital time-series tasks in many application fields. Both are challenging, especially when the signal is nonlinear and non-stationary, as is common in practice. In this paper, we propose a hybrid 2-stage approach, named IF2FNN, to predict (both short-term and long-term) and recover general types of time series. In the first stage, we decompose the original non-stationary series into several "quasi-stationary" intrinsic mode functions (IMFs) using the iterative filtering (IF) method. In the second stage, all of the IMFs are fed as inputs to a factorization machine based neural network model to perform the prediction and recovery. We test the strategy on five datasets: an artificially constructed signal (ACS) and four real-world signals, namely the length of day (LOD), the Northern Hemisphere land-ocean temperature index (NHLTI), the troposphere monthly mean temperature (TMMT), and the National Association of Securities Dealers Automated Quotations (NASDAQ) index. The results are compared with those obtained from other prevailing methods. Our experiments indicate that, under the same conditions, the proposed method outperforms the others in both prediction and recovery according to several metrics, including mean absolute error (MAE), root mean square error (RMSE), and mean absolute percentage error (MAPE).
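The first stage described above can be sketched in code. The snippet below is a minimal illustration of iterative filtering, not the authors' implementation: it assumes a fixed moving-average window for the smoothing mask, whereas the actual IF method adapts the mask length to the local scale of the signal, and it omits the second-stage factorization machine based neural network entirely. Each IMF is obtained by repeatedly subtracting a moving average from the current component until it stabilizes; the process is then repeated on the remainder.

```python
import numpy as np

def moving_average(x, half_width):
    """Symmetric moving average with reflective padding at the boundaries."""
    padded = np.pad(x, half_width, mode="reflect")
    kernel = np.ones(2 * half_width + 1) / (2 * half_width + 1)
    smoothed = np.convolve(padded, kernel, mode="same")
    return smoothed[half_width:-half_width]

def iterative_filtering(signal, n_imfs=3, half_width=10, max_iter=50, tol=1e-6):
    """Decompose a signal into quasi-stationary IMFs plus a residual trend.

    Simplified sketch: the smoothing window (half_width) is fixed here,
    while the real IF method chooses it adaptively from the signal.
    """
    imfs = []
    residual = np.asarray(signal, dtype=float).copy()
    for _ in range(n_imfs):
        imf = residual.copy()
        for _ in range(max_iter):
            new_imf = imf - moving_average(imf, half_width)
            # Stop sifting once the component no longer changes appreciably.
            if np.linalg.norm(new_imf - imf) <= tol * (np.linalg.norm(imf) + 1e-12):
                imf = new_imf
                break
            imf = new_imf
        imfs.append(imf)
        residual = residual - imf  # peel off the extracted oscillation
    return imfs, residual

# Example: a fast oscillation superimposed on a slow one.
t = np.linspace(0.0, 1.0, 400)
x = np.sin(2 * np.pi * 25 * t) + np.sin(2 * np.pi * 2 * t)
imfs, residual = iterative_filtering(x, n_imfs=2, half_width=5)
```

By construction the decomposition is exact: summing the extracted IMFs and the residual reproduces the input, which is what lets the second stage work on the components without losing information.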
Zhou, F., Zhou, HM., Yang, ZH. et al. A 2-Stage Strategy for Non-Stationary Signal Prediction and Recovery Using Iterative Filtering and Neural Network. J. Comput. Sci. Technol. 34, 318–338 (2019). https://doi.org/10.1007/s11390-019-1913-0