TF-CL: Time Series Forcasting Based on Time-Frequency Domain Contrastive Learning

  • Conference paper
  • First Online:
Artificial Neural Networks and Machine Learning – ICANN 2024 (ICANN 2024)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 15021)

Included in the following conference series:

  • ICANN: International Conference on Artificial Neural Networks

Abstract

Accurately forecasting complex time series data is an essential task across a wide range of application scenarios. While mainstream research on contrastive learning for time series forecasting focuses on time-domain data augmentation to compute the contrastive loss, the importance of frequency-domain analysis is often overlooked. Frequency-domain analysis can reveal the main frequency components of the data, providing a unique perspective for understanding its periodicity, oscillation, and fluctuation characteristics. We propose TF-CL, a time series forecasting method based on time-frequency domain contrastive learning. TF-CL is built on the Bayesian time series hypothesis and comprises three core components: an encoder, a trend feature extractor, and a seasonal feature extractor, each using contrastive loss functions in both the time and frequency domains for feature extraction. Empirical analysis on six cross-domain datasets shows that, compared with current benchmark methods, our model achieves superior predictive performance in the majority of cases. Furthermore, detailed ablation studies validate the critical role of combining frequency-domain data augmentation with time-frequency analysis in improving predictive performance.
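The abstract describes contrastive objectives computed in both the time and frequency domains. As a rough illustration only (not the authors' implementation), the sketch below pairs an InfoNCE-style contrastive loss on two jitter-augmented views of a batch of series with the same loss on their FFT amplitude spectra; the jitter augmentation, the temperature value, and the use of raw series in place of learned encoder, trend, and seasonal features are all assumptions made for brevity.

```python
# Minimal sketch (not the paper's code): InfoNCE-style contrastive losses
# computed on time-domain views and on their frequency-domain (FFT amplitude)
# counterparts. All names and hyperparameters here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE loss treating z1[i] and z2[i] as the positive pair for row i."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = (z1 @ z2.T) / temperature            # (B, B) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))            # positives lie on the diagonal

def jitter(x, sigma=0.03):
    """Hypothetical time-domain augmentation: additive Gaussian noise."""
    return x + rng.normal(0.0, sigma, size=x.shape)

batch = rng.normal(size=(8, 96))                  # 8 series, each of length 96
view_a, view_b = jitter(batch), jitter(batch)     # two augmented views

# Time-domain contrastive term on the augmented series themselves.
loss_time = info_nce(view_a, view_b)

# Frequency-domain contrastive term on the FFT amplitude spectra of the views.
loss_freq = info_nce(np.abs(np.fft.rfft(view_a, axis=1)),
                     np.abs(np.fft.rfft(view_b, axis=1)))

total_loss = loss_time + loss_freq                # combined time-frequency objective
print(f"time: {loss_time:.3f}  freq: {loss_freq:.3f}  total: {total_loss:.3f}")
```

In TF-CL the contrastive terms would presumably be applied to the representations produced by the encoder and the trend and seasonal feature extractors rather than to the raw series, and combined with the forecasting objective; the sketch only shows how a time-domain and a frequency-domain contrastive loss can be formed and summed.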

Notes

  1. https://www.bgc-jena.mpg.de/wetter/.
  2. https://gis.cdc.gov/grasp/fluview/fluportaldashboard.html.
  3. http://pems.dot.ca.gov.

Author information

Corresponding author

Correspondence to Shouguo Du.

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Li, W., Gu, Y., Du, S. (2024). TF-CL: Time Series Forcasting Based on Time-Frequency Domain Contrastive Learning. In: Wand, M., Malinovská, K., Schmidhuber, J., Tetko, I.V. (eds) Artificial Neural Networks and Machine Learning – ICANN 2024. ICANN 2024. Lecture Notes in Computer Science, vol 15021. Springer, Cham. https://doi.org/10.1007/978-3-031-72347-6_21

  • DOI: https://doi.org/10.1007/978-3-031-72347-6_21

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-72346-9

  • Online ISBN: 978-3-031-72347-6

  • eBook Packages: Computer Science, Computer Science (R0)
