Low-Dimensional Hyperbolic Knowledge Graph Embedding for Better Extrapolation to Under-Represented Data

  • Conference paper
  • The Semantic Web (ESWC 2024)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14664)

Abstract

Past work has shown that knowledge graph embedding (KGE) methods learn from facts in the form of triples and extrapolate to unseen triples. KGE in hyperbolic space can achieve impressive performance even in low-dimensional embedding spaces. However, existing work has only studied extrapolation to under-represented data, including under-represented entities and relations, to a limited extent. To this end, we propose HolmE, a general form of KGE method on hyperbolic manifolds. HolmE addresses extrapolation to under-represented entities through a special treatment of the bias term, and extrapolation to under-represented relations by supporting strong composition. We provide empirical evidence that HolmE achieves promising performance in modelling unseen triples, under-represented entities, and under-represented relations. We prove that mainstream KGE methods either (1) are special cases of HolmE and thus support strong composition, or (2) do not support strong composition. The code and data are open-sourced at https://github.com/nsai-uio/HolmE-KGE.
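The general recipe behind distance-based hyperbolic KGE models of this family can be illustrated with a minimal sketch (this is an illustration, not the paper's exact formulation): entities live in the Poincaré ball, a relation transforms the head embedding (here a simple 2-D rotation, as in RotH-style models), and the score is the negative hyperbolic distance to the tail, optionally shifted by entity bias terms. All embeddings and angles below are hypothetical.

```python
import math

def sq_norm(v):
    return sum(x * x for x in v)

def poincare_dist(x, y):
    # Distance in the Poincare ball model of hyperbolic space:
    # d(x, y) = arcosh(1 + 2||x - y||^2 / ((1 - ||x||^2)(1 - ||y||^2))).
    diff = [a - b for a, b in zip(x, y)]
    num = 2 * sq_norm(diff)
    den = (1 - sq_norm(x)) * (1 - sq_norm(y))
    return math.acosh(1 + num / den)

def rotate(v, theta):
    # A 2-D rotation as a simple relation-specific transformation.
    c, s = math.cos(theta), math.sin(theta)
    return [c * v[0] - s * v[1], s * v[0] + c * v[1]]

def score(head, rel_theta, tail, bias_h=0.0, bias_t=0.0):
    # Higher is better. Bias-free variants (as studied for
    # under-represented entities) simply leave bias_h = bias_t = 0.
    return -poincare_dist(rotate(head, rel_theta), tail) + bias_h + bias_t

# Hypothetical 2-D embeddings inside the unit ball.
h, t = [0.3, 0.1], [0.1, 0.3]
print(score(h, math.pi / 2, t))
```

Because the distance term is always non-negative, a bias-free score is at most zero, which is one reason the bias term can act as a learned prior favouring frequent entities.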


Notes

  1. Proof: see https://github.com/nsai-uio/HolmE-KGE/blob/main/Proof.pdf.

  2. Proof: see https://github.com/nsai-uio/HolmE-KGE/blob/main/Proof.pdf.

References

  1. Balažević, I., Allen, C., Hospedales, T.: Multi-relational Poincaré graph embeddings. Adv. Neural Inf. Process. Syst. 32 (2019)

  2. Balažević, I., Allen, C., Hospedales, T.: TuckER: tensor factorization for knowledge graph completion. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pp. 5185–5194 (2019)

  3. Bordes, A., Usunier, N., Garcia-Duran, A., Weston, J., Yakhnenko, O.: Translating embeddings for modeling multi-relational data. Adv. Neural Inf. Process. Syst. 26 (2013)

  4. Cao, E., Wang, D., Huang, J., Hu, W.: Open knowledge enrichment for long-tail entities. In: Proceedings of The Web Conference 2020, pp. 384–394 (2020)

  5. Cao, Z., Xu, Q., Yang, Z., Cao, X., Huang, Q.: Geometry interaction knowledge graph embeddings. In: AAAI Conference on Artificial Intelligence (2022)

  6. Chami, I., Wolf, A., Juan, D.C., Sala, F., Ravi, S., Ré, C.: Low-dimensional hyperbolic knowledge graph embeddings. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 6901–6914 (2020)

  7. Chami, I., Ying, Z., Ré, C., Leskovec, J.: Hyperbolic graph convolutional neural networks. Adv. Neural Inf. Process. Syst. 32 (2019)

  8. Chu, P., Bian, X., Liu, S., Ling, H.: Feature space augmentation for long-tailed data. In: Vedaldi, A., Bischof, H., Brox, T., Frahm, J.-M. (eds.) ECCV 2020. LNCS, vol. 12374, pp. 694–710. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-58526-6_41

  9. Dai, E., et al.: A comprehensive survey on trustworthy graph neural networks: privacy, robustness, fairness, and explainability. arXiv preprint arXiv:2204.08570 (2022)

  10. Nguyen, D.Q., Nguyen, T.D., Nguyen, D.Q., Phung, D.: A novel embedding model for knowledge base completion based on convolutional neural network. In: Proceedings of NAACL-HLT, pp. 327–333 (2018)

  11. Dettmers, T., Minervini, P., Stenetorp, P., Riedel, S.: Convolutional 2D knowledge graph embeddings. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018)

  12. Djeddi, W.E., Hermi, K., Ben Yahia, S., Diallo, G.: Advancing drug-target interaction prediction: a comprehensive graph-based approach integrating knowledge graph embedding and ProtBERT pretraining. BMC Bioinformatics 24(1), 488 (2023)

  13. Dong, Y., Ma, J., Wang, S., Chen, C., Li, J.: Fairness in graph mining: a survey. IEEE Trans. Knowl. Data Eng. (2023)

  14. Ebisu, T., Ichise, R.: TorusE: knowledge graph embedding on a Lie group. In: Thirty-Second AAAI Conference on Artificial Intelligence (2018)

  15. Ganea, O., Bécigneul, G., Hofmann, T.: Hyperbolic neural networks. Adv. Neural Inf. Process. Syst. 31 (2018)

  16. Islam, M.K., Amaya-Ramirez, D., Maigret, B., Devignes, M.D., Aridhi, S., Smaïl-Tabbone, M.: Molecular-evaluated and explainable drug repurposing for COVID-19 using ensemble knowledge graph embedding. Sci. Rep. 13(1), 3643 (2023)

  17. Ji, G., He, S., Xu, L., Liu, K., Zhao, J.: Knowledge graph embedding via dynamic mapping matrix. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pp. 687–696 (2015)

  18. Lacroix, T., Usunier, N., Obozinski, G.: Canonical tensor decomposition for knowledge base completion. In: International Conference on Machine Learning, pp. 2863–2872. PMLR (2018)

  19. Li, M., Sun, Z., Zhang, S., Zhang, W.: Enhancing knowledge graph embedding with relational constraints. Neurocomputing 429, 77–88 (2021)

  20. Li, R., et al.: HousE: knowledge graph embedding with Householder parameterization. arXiv preprint arXiv:2202.07919 (2022)

  21. Lin, Y., Liu, Z., Sun, M., Liu, Y., Zhu, X.: Learning entity and relation embeddings for knowledge graph completion. In: Twenty-Ninth AAAI Conference on Artificial Intelligence (2015)

  22. Liu, X., Zhao, F., Gui, X., Jin, H.: LeKAN: extracting long-tail relations via layer-enhanced knowledge-aggregation networks. In: Bhattacharya, A., et al. (eds.) Database Systems for Advanced Applications. DASFAA 2022. LNCS, vol. 13245, pp. 122–136. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-00123-9_9

  23. Mahdisoltani, F., Biega, J., Suchanek, F.: YAGO3: a knowledge base from multilingual Wikipedias. In: 7th Biennial Conference on Innovative Data Systems Research (CIDR) (2014)

  24. Pavlović, A., Sallinger, E.: ExpressivE: a spatio-functional embedding for knowledge graph completion. arXiv preprint arXiv:2206.04192 (2022)

  25. Sun, Z., Deng, Z.H., Nie, J.Y., Tang, J.: RotatE: knowledge graph embedding by relational rotation in complex space. In: International Conference on Learning Representations (2018)

  26. Toutanova, K., Chen, D.: Observed versus latent features for knowledge base and text inference. In: Proceedings of the 3rd Workshop on Continuous Vector Space Models and their Compositionality, pp. 57–66 (2015)

  27. Trouillon, T., Welbl, J., Riedel, S., Gaussier, É., Bouchard, G.: Complex embeddings for simple link prediction. In: International Conference on Machine Learning, pp. 2071–2080. PMLR (2016)

  28. Vashishth, S., Sanyal, S., Nitin, V., Talukdar, P.: Composition-based multi-relational graph convolutional networks. arXiv preprint arXiv:1911.03082 (2019)

  29. Wang, S., Huang, X., Chen, C., Wu, L., Li, J.: REFORM: error-aware few-shot knowledge graph completion. In: Proceedings of the 30th ACM International Conference on Information & Knowledge Management, pp. 1979–1988 (2021)

  30. Wang, Z., Zhang, J., Feng, J., Chen, Z.: Knowledge graph embedding by translating on hyperplanes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 28 (2014)

  31. Werner, S., Rettinger, A., Halilaj, L., Lüttin, J.: RETRA: recurrent transformers for learning temporally contextualized knowledge graph embeddings. In: Verborgh, R., et al. (eds.) ESWC 2021. LNCS, vol. 12731, pp. 425–440. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-77385-4_25

  32. Willmore, T.J.: An Introduction to Differential Geometry. Courier Corporation, Honolulu (2013)

  33. Xu, Y.W., Zhang, H.J., Cheng, K., Liao, X.L., Zhang, Z.X., Li, Y.B.: Knowledge graph embedding with entity attributes using hypergraph neural networks. Intell. Data Anal. 26(4), 959–975 (2022)

  34. Yang, B., Yih, W.t., He, X., Gao, J., Deng, L.: Embedding entities and relations for learning and inference in knowledge bases. In: Proceedings of the International Conference on Learning Representations (ICLR) (2015)

  35. Zhang, C., Yao, H., Huang, C., Jiang, M., Li, Z., Chawla, N.V.: Few-shot knowledge graph completion. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3041–3048 (2020)

Author information

Correspondence to Zhuoxun Zheng.

A Result Details

Correlation Between Entity Frequency and Bias Term. Fig. 8 shows that, across different datasets and models, there is a strong correlation between the bias term and entity frequency.

Fig. 8. Correlation between bias term and entity frequency. (a), (b): AttH and HolmE on FB15k-237, respectively; (c), (d): AttH and HolmE on WN18RR, respectively.
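This kind of analysis amounts to counting how often each entity appears in the training triples and correlating those counts with the learned per-entity bias values. A minimal sketch with hypothetical triples and bias values (not the paper's data or models):

```python
import math
from collections import Counter

def pearson(xs, ys):
    # Pearson correlation coefficient of two equal-length sequences.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical training triples; entity frequency counts both head and tail slots.
triples = [("a", "r1", "b"), ("a", "r2", "c"), ("a", "r1", "d"), ("b", "r2", "c")]
freq = Counter()
for h, _, t in triples:
    freq[h] += 1
    freq[t] += 1

# Hypothetical learned bias values, one per entity.
bias = {"a": 0.9, "b": 0.55, "c": 0.5, "d": 0.2}
entities = sorted(freq)
print(pearson([freq[e] for e in entities], [bias[e] for e in entities]))
```

A correlation close to 1 would indicate that the bias term mostly encodes entity frequency, i.e. a prior that penalises under-represented entities.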

Influence of Entity and Relation Frequency Threshold. Fig. 9 shows that the choice of entity and relation frequency threshold does not affect the validity of the observations and claims; all results are obtained on FB15k-237. (Fig. 9a) Models without bias (HolmE, AttH-ub) outperform their counterparts with bias (HolmE-b, AttH), and HolmE outperforms the other models on triples with under-represented entities. (Fig. 9b) The majority of entities are under-represented as long as the threshold is not chosen to be extremely small, considering that the mean entity frequency is 37.5 and the maximum is 7614. (Fig. 9c) HolmE outperforms the other models on triples with under-represented relations. (Fig. 9d) The majority of relations are under-represented as long as the threshold is not chosen to be extremely small, considering that the mean relation frequency is 1148.2 and the maximum is 15989.

Fig. 9. The choice of threshold does not influence the observations or claims.
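The threshold analysis boils down to computing, for each candidate threshold, the fraction of entities (or relations) whose training frequency falls below it. A sketch with a hypothetical long-tailed frequency distribution (not the FB15k-237 counts):

```python
from collections import Counter

def under_represented_fraction(frequencies, threshold):
    # Fraction of items whose frequency is strictly below the threshold.
    below = sum(1 for f in frequencies.values() if f < threshold)
    return below / len(frequencies)

# Hypothetical long-tailed distribution: a few very frequent entities,
# many rare ones (roughly 1/rank, as in real KGs).
freq = Counter({"e%d" % i: max(1, 1000 // (i + 1)) for i in range(1000)})
for thr in (5, 20, 50):
    print(thr, round(under_represented_fraction(freq, thr), 2))
```

With a long-tailed distribution, the fraction stays large over a wide range of thresholds, which is why the exact threshold choice does not change the qualitative picture.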

Table 6. HolmE achieves results comparable to SotA hyperbolic KGE methods in high-dimensional embedding space. Best score in bold, second best underlined. Scores are taken from the cited sources or reproduced by us with open-source code.

Link Prediction Results in High Dimension. As expected, embeddings in different spaces achieve similar results in high-dimensional space (Table 6), because both Euclidean and hyperbolic spaces become expressive enough to represent the complex hierarchies in KGs [6]. As in low-dimensional space, HolmE (without bias) is slightly worse than the models with bias (GIE, HolmE-b), although HolmE does not exploit the prior probabilities provided by the bias term (Tables 7 and 8).

Table 7. Detailed analysis for relation patterns on FB15k-237. Sym.: Symmetry, Asym.: Asymmetry, Inv.: Inversion, Tran.: Transitivity, Comp.: Composition.
Table 8. Detailed analysis for relation mapping properties in low-dimensional space on FB15k-237.
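Link prediction results of this kind are typically summarized by mean reciprocal rank (MRR) and Hits@k over the ranks assigned to the correct entities. A minimal sketch with hypothetical ranks:

```python
def mrr_and_hits(ranks, k=10):
    # Mean reciprocal rank and Hits@k, given the rank of the correct
    # entity in each link-prediction query (rank 1 = best).
    mrr = sum(1.0 / r for r in ranks) / len(ranks)
    hits = sum(1 for r in ranks if r <= k) / len(ranks)
    return mrr, hits

# Hypothetical ranks of the correct entity across five test queries.
ranks = [1, 3, 2, 50, 7]
mrr, h10 = mrr_and_hits(ranks)
print(round(mrr, 3), h10)
```

Computing these metrics separately over triples with under-represented entities or relations is how the per-group comparisons in the figures and tables above are obtained.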


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Zheng, Z. et al. (2024). Low-Dimensional Hyperbolic Knowledge Graph Embedding for Better Extrapolation to Under-Represented Data. In: Meroño Peñuela, A., et al. The Semantic Web. ESWC 2024. Lecture Notes in Computer Science, vol 14664. Springer, Cham. https://doi.org/10.1007/978-3-031-60626-7_6


  • DOI: https://doi.org/10.1007/978-3-031-60626-7_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-60625-0

  • Online ISBN: 978-3-031-60626-7

  • eBook Packages: Computer Science; Computer Science (R0)
