Abstract
Accurately forecasting complex time series is an essential task across a wide range of application scenarios. Mainstream research on contrastive learning for time series forecasting currently focuses on time-domain data augmentation for computing the contrastive loss, while the importance of frequency-domain analysis is often overlooked. Frequency-domain analysis reveals the dominant frequency components of the data, providing a unique perspective for understanding its periodicity, oscillation, and fluctuation characteristics. We propose TF-CL, a time series forecasting method based on time-frequency domain contrastive learning. TF-CL is built on the Bayesian time series hypothesis and comprises three core components: an encoder, a trend feature extractor, and a seasonal feature extractor, each of which applies contrastive loss functions in both the time and frequency domains for feature extraction. Empirical analysis on six cross-domain datasets shows that, compared with current benchmark methods, our model achieves superior predictive performance in the majority of cases. Detailed ablation studies further validate the critical role of combining frequency-domain data augmentation with time-frequency analysis in improving predictive performance.
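To make the idea of a time-frequency contrastive objective concrete, the sketch below shows one plausible way to combine an InfoNCE loss on pooled time-domain representations with a second InfoNCE term on their amplitude spectra obtained via an FFT. The function names, the mean pooling, the weighting `alpha`, and the use of `torch.fft.rfft` are illustrative assumptions for exposition only and are not taken from the paper's actual loss definitions.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """Standard InfoNCE loss between two batches of L2-normalized embeddings."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature          # (B, B) similarity matrix
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)

def time_frequency_contrastive_loss(h_anchor, h_augmented, alpha=0.5):
    """Hypothetical combination of a time-domain and a frequency-domain contrastive term.

    h_anchor, h_augmented: (batch, seq_len, dim) representations of the original
    series and of an augmented view, e.g. from a trend or seasonal extractor.
    """
    # Time-domain term: contrast temporally pooled representations of the two views.
    loss_time = info_nce(h_anchor.mean(dim=1), h_augmented.mean(dim=1))

    # Frequency-domain term: contrast amplitude spectra computed with a real FFT
    # along the temporal axis, so matching views agree on dominant frequencies.
    spec_anchor = torch.fft.rfft(h_anchor, dim=1).abs().mean(dim=1)
    spec_augmented = torch.fft.rfft(h_augmented, dim=1).abs().mean(dim=1)
    loss_freq = info_nce(spec_anchor, spec_augmented)

    return alpha * loss_time + (1.0 - alpha) * loss_freq
```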
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Li, W., Gu, Y., Du, S. (2024). TF-CL: Time Series Forcasting Based on Time-Frequency Domain Contrastive Learning. In: Wand, M., Malinovská, K., Schmidhuber, J., Tetko, I.V. (eds) Artificial Neural Networks and Machine Learning – ICANN 2024. ICANN 2024. Lecture Notes in Computer Science, vol 15021. Springer, Cham. https://doi.org/10.1007/978-3-031-72347-6_21
DOI: https://doi.org/10.1007/978-3-031-72347-6_21
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-72346-9
Online ISBN: 978-3-031-72347-6
eBook Packages: Computer Science, Computer Science (R0)