HGNN-ETA: Heterogeneous graph neural network enriched with text attribute

Published in: World Wide Web

Abstract

Heterogeneous graphs can accurately and effectively model the rich semantic information and complex relationships of real-world networks. As deep representation models for nodes, heterogeneous graph neural networks (HGNNs) offer powerful graph-data processing capabilities and perform strongly on network analysis tasks. However, traditional HGNN models cannot mine all of the semantic information in text during text-attribute vector learning; this causes a loss of node information, which degrades model performance. In this study, we propose HGNN-ETA, a novel HGNN framework enriched with text attributes, which incorporates the text attributes of nodes. The framework encodes node text attributes with an attention mechanism, completes node attributes via a neighborhood aggregation mechanism, and constructs and optimizes the heterogeneous graph model end to end. Extensive experiments and comparisons on two real-world heterogeneous graph datasets demonstrate that the proposed framework is more effective than other state-of-the-art models.
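The two core ideas in the abstract, attention-based encoding of a node's text attribute and neighborhood aggregation to complete missing attributes, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all function names, the attention form (dot-product pooling), the mean-based completion, and the toy data are assumptions made for illustration.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def encode_text(token_embs, query):
    """Attention-pool token embeddings into one text-attribute vector."""
    scores = token_embs @ query          # one score per token
    weights = softmax(scores)            # attention weights over tokens
    return weights @ token_embs          # weighted sum -> (dim,) vector

def complete_attributes(features, adjacency):
    """Fill a node's missing feature vector from its neighbors' (mean)."""
    completed = dict(features)
    for node, neighbors in adjacency.items():
        if node not in completed:
            known = [features[n] for n in neighbors if n in features]
            if known:
                completed[node] = np.mean(known, axis=0)
    return completed

rng = np.random.default_rng(0)
dim = 4
# Nodes "a" and "b" carry text; node "c" has no attribute of its own
# and receives one aggregated from its neighbors.
feats = {
    "a": encode_text(rng.normal(size=(5, dim)), rng.normal(size=dim)),
    "b": encode_text(rng.normal(size=(3, dim)), rng.normal(size=dim)),
}
adj = {"c": ["a", "b"]}
feats = complete_attributes(feats, adj)
print(feats["c"].shape)  # (4,)
```

In the actual model, the pooled text vectors and completed attributes would then feed a heterogeneous GNN trained end to end; the sketch only shows the attribute-preparation stage the abstract describes.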


The full article includes Figs. 1–8 and Algorithm 1.


Notes

  1. https://dblp.uni-trier.de

  2. https://www.imdb.com/

  3. https://github.com/wz1714748313/HGNN-ETA



Acknowledgements

This research was supported by the National Natural Science Foundation of China (Grant Nos. 62072288 and 61702306), the Taishan Scholar Program of Shandong Province, and the Natural Science Foundation of Shandong Province (Grant No. ZR2022MF268).

Funding

This study was funded by the National Natural Science Foundation of China (Grant Nos. 62072288 and 61702306), the Taishan Scholar Program of Shandong Province, and the Natural Science Foundation of Shandong Province (Grant No. ZR2022MF268).

Author information

Authors and Affiliations

Authors

Contributions

Chao Li, Zhen Wang, and Zhongying Zhao wrote the main manuscript text; Zhen Wang prepared the experimental results; all authors reviewed the manuscript.

Corresponding author

Correspondence to Chao Li.

Ethics declarations

Consent for Publication

All authors consent to the publication of this article.

Competing interests

The authors declare that they have no competing interests.

Additional information

Availability of supporting data

The datasets used in the experiments are publicly available in the online repository.

Consent to participate

All authors consented to participate in this study.

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Li, C., Wang, Z., Zhao, Z. et al. HGNN-ETA: Heterogeneous graph neural network enriched with text attribute. World Wide Web 26, 1913–1934 (2023). https://doi.org/10.1007/s11280-022-01120-4

Download citation

  • Received:

  • Revised:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s11280-022-01120-4
