
HARPA: hierarchical attention with relation paths for knowledge graph embedding adversarial learning

Published in Data Mining and Knowledge Discovery

Abstract

Knowledge graph embedding (KGE) aims to map a knowledge graph into a low-dimensional continuous vector space and thereby provide a unified underlying representation for downstream tasks. Recently, graph neural networks (GNNs) have been widely used for knowledge graph embedding because of their powerful feature-extraction ability, and most GNN-based KGE models rely on aggregation operations to extract latent information from triples. Unfortunately, these models emphasize entity embedding and update relations with only shallow operations, so relation embeddings are learned in a relatively simplistic way, and they ignore the rich inference information contained in multi-hop paths. In addition, their complex network structures lack regularization constraints and are therefore prone to over-fitting. This paper therefore proposes a novel hierarchical attention with relation paths model for knowledge graph embedding adversarial learning (HARPA). HARPA constructs a two-layer attention encoder that learns triple and neighborhood information at the triple level and further exploits the rich inference information of paths to learn relation embeddings in depth at the path level. Besides, HARPA introduces an improved generative adversarial network (GAN), named I-GAN, as a regularization term that constrains the embedding-learning process and enables the model to learn high-quality, robust embeddings. Link prediction experiments on four common knowledge graphs show that HARPA outperforms state-of-the-art methods.
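
The abstract describes three components; the minimal PyTorch sketch below illustrates how such pieces might fit together. It is an illustration only, not the authors' released implementation: a triple-level attention layer that aggregates neighboring triples into an entity representation, a path-level module that refines a relation embedding from multi-hop relation paths, and a small discriminator standing in for the role the I-GAN regularizer plays. All class names, dimensions, and the GRU-based path composition are assumptions made for the sketch.

```python
# Hypothetical sketch of the ideas summarized in the abstract (not the HARPA code itself).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TripleLevelAttention(nn.Module):
    """Aggregate the (h, r, t) triples around one entity with learned attention weights."""

    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Linear(3 * dim, 1)   # attention score per neighboring triple
        self.proj = nn.Linear(3 * dim, dim)  # message projection

    def forward(self, h, r, t):
        # h, r, t: (num_neighbors, dim) embeddings of the triples around one entity
        msg = torch.cat([h, r, t], dim=-1)                        # (N, 3*dim)
        alpha = F.softmax(F.leaky_relu(self.score(msg)), dim=0)   # (N, 1)
        return (alpha * self.proj(msg)).sum(dim=0)                # (dim,) updated entity vector


class PathLevelRelationUpdate(nn.Module):
    """Refine a relation embedding from multi-hop relation paths linking the same entity pair."""

    def __init__(self, dim: int):
        super().__init__()
        self.rnn = nn.GRU(dim, dim, batch_first=True)  # compose relations along each path
        self.score = nn.Linear(dim, 1)

    def forward(self, rel, paths):
        # rel: (dim,) direct relation; paths: (num_paths, path_len, dim) relation sequences
        _, h_n = self.rnn(paths)                         # (1, num_paths, dim)
        path_vec = h_n.squeeze(0)                        # (num_paths, dim)
        alpha = F.softmax(self.score(path_vec), dim=0)   # attention over paths
        return rel + (alpha * path_vec).sum(dim=0)       # path-aware relation vector


class EmbeddingDiscriminator(nn.Module):
    """Adversarial regularizer: scores whether a triple embedding looks 'real' or generated."""

    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(3 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, h, r, t):
        return torch.sigmoid(self.net(torch.cat([h, r, t], dim=-1)))


if __name__ == "__main__":
    dim, n_neigh, n_paths, path_len = 16, 5, 3, 2
    ent_layer = TripleLevelAttention(dim)
    rel_layer = PathLevelRelationUpdate(dim)
    disc = EmbeddingDiscriminator(dim)

    e = ent_layer(torch.randn(n_neigh, dim), torch.randn(n_neigh, dim), torch.randn(n_neigh, dim))
    r = rel_layer(torch.randn(dim), torch.randn(n_paths, path_len, dim))
    print(e.shape, r.shape, disc(e, r, torch.randn(dim)).shape)
```

In such a design, the discriminator's output would enter the training objective as an additional loss term, so the encoder is pushed toward embeddings that the discriminator cannot distinguish from the reference distribution, which is the regularization role the abstract attributes to I-GAN.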


Acknowledgements

This work was supported by the National Key R&D Program of China (2019YFC1711000) and the Collaborative Innovation Center of Novel Software Technology and Industrialization.

Author information


Corresponding author

Correspondence to Jieyue He.

Additional information

Communicated by Tim Weninger.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Zhang, N., Wang, J. & He, J. HARPA: hierarchical attention with relation paths for knowledge graph embedding adversarial learning. Data Min Knowl Disc 37, 521–551 (2023). https://doi.org/10.1007/s10618-022-00888-3

