
DGNN-MN: Dynamic Graph Neural Network via memory regenerate and neighbor propagation


Abstract

Dynamic Graph Neural Network (DGNN) models are widely used for modeling, prediction, and recommendation tasks in domains such as e-commerce and social networks, owing to their ability to capture both node-interaction and temporal features. Current methods for dynamic graph representation learning mainly derive node representations by querying K-hop neighbors and applying the triadic closure law. However, as the number of network layers increases, these approaches suffer from over-smoothing and excessive computational cost. Moreover, existing models cannot guarantee that events arrive at adjacent nodes in chronological order of their timestamps. To address these problems, we propose a Dynamic Graph Neural Network via Memory Regenerate and Neighbor Propagation (DGNN-MN) model. The model introduces a memory regeneration strategy for capturing node temporal features and a time-edge propagation method for gathering neighbor information; fusing the two output vectors yields the final node representations. In addition, we present a timestamp-encoding strategy for node messages, which ensures that messages propagate to neighboring nodes in timestamp order and thereby better captures the temporal characteristics of events in dynamic graphs. Extensive experiments on five public datasets demonstrate the effectiveness of DGNN-MN on link prediction and node classification tasks, where it outperforms other state-of-the-art methods. The data and code are available on GitHub.
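
The full method details sit behind the access wall below, so the following is only a rough illustration of two ideas the abstract names: encoding event timestamps and delivering messages to neighbors strictly in timestamp order. All names, dimensions, and the GRU-based memory update are hypothetical stand-ins (the time encoder follows the well-known sinusoidal functional time representation of Xu et al.), not the paper's actual implementation; see the authors' GitHub repository for the real code.

```python
# Hypothetical sketch (not the authors' code): timestamp encoding plus
# timestamp-ordered message delivery, two ideas named in the abstract.
import torch
import torch.nn as nn

STATE_DIM, FEAT_DIM, TIME_DIM = 16, 8, 8  # illustrative sizes, not from the paper

class TimeEncoder(nn.Module):
    """Sinusoidal timestamp encoding (TGAT-style functional time
    representation); a stand-in for the paper's timestamp-encoding strategy."""
    def __init__(self, dim: int):
        super().__init__()
        self.lin = nn.Linear(1, dim)

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (batch,) timestamps -> (batch, dim) phase features
        return torch.cos(self.lin(t.unsqueeze(-1)))

time_enc = TimeEncoder(TIME_DIM)
# A GRU cell as a generic stand-in for a node-memory update function
cell = nn.GRUCell(STATE_DIM + FEAT_DIM + TIME_DIM, STATE_DIM)

def propagate_in_time_order(events, memory):
    """Apply interaction events to destination-node memories in ascending
    timestamp order, so no node sees a later event before an earlier one.
    events: list of (src, dst, t, feat); memory: node id -> state vector."""
    for src, dst, t, feat in sorted(events, key=lambda e: e[2]):
        msg = torch.cat([memory[src], feat, time_enc(torch.tensor([t]))[0]])
        memory[dst] = cell(msg.unsqueeze(0), memory[dst].unsqueeze(0)).squeeze(0)

# Usage: the t=2.0 event is applied before the t=5.0 one regardless of input order.
memory = {i: torch.zeros(STATE_DIM) for i in range(3)}
events = [(0, 1, 5.0, torch.randn(FEAT_DIM)), (1, 2, 2.0, torch.randn(FEAT_DIM))]
propagate_in_time_order(events, memory)
```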




Availability of supporting data

The datasets used in the experiments are publicly available in the online repository.

Notes

  1. http://snap.stanford.edu/jodie/reddit.csv

  2. http://snap.stanford.edu/jodie/wikipedia.csv

  3. https://www.cs.cmu.edu/~./enron/

  4. https://dblp.uni-trier.de/

  5. https://github.com/LiuRS1/DGNN-MN


Funding

This work is supported by the National Key R&D Program of China (Grant No. 2022ZD0119501); the National Natural Science Foundation of China (Grant Nos. 62072288 and 52374221); the Natural Science Foundation of Shandong Province (Grant Nos. ZR2022MF268 and ZR2021QG038); the Shandong Youth Innovation Team; the Taishan Scholar Program of Shandong Province (Grant Nos. tsqn202211154 and ts20190936); and the ‘Qunxing Plan’ project of educational and teaching research of Shandong University of Science and Technology (Grant No. QX2020Z12).

Author information


Contributions

Chao Li, Runshuo Liu, Jinhu Fu, Zhongying Zhao, Hua Duan, and Qingtian Zeng wrote the main manuscript text; Runshuo Liu and Jinhu Fu prepared the experimental results; all authors reviewed the manuscript.

Corresponding author

Correspondence to Chao Li.

Ethics declarations

Ethical Approval

Not applicable.

Consent to participate

All authors consent to participate.

Human and Animal Ethics

Not applicable.

Consent for publication

All authors consent to publication.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Li, C., Liu, R., Fu, J. et al. DGNN-MN: Dynamic Graph Neural Network via memory regenerate and neighbor propagation. Appl Intell 54, 9253–9268 (2024). https://doi.org/10.1007/s10489-024-05500-3

