Abstract
Past work has shown that knowledge graph embedding (KGE) methods learn from facts in the form of triples and extrapolate to unseen triples. KGE in hyperbolic space can achieve impressive performance even in low-dimensional embedding spaces. However, existing work has studied extrapolation to under-represented data, including under-represented entities and relations, only to a limited extent. To this end, we propose HolmE, a general form of KGE method on hyperbolic manifolds. HolmE addresses extrapolation to under-represented entities through a special treatment of the bias term, and extrapolation to under-represented relations by supporting strong composition. We provide empirical evidence that HolmE achieves promising performance in modelling unseen triples, under-represented entities, and under-represented relations. We prove that mainstream KGE methods either (1) are special cases of HolmE and thus support strong composition, or (2) do not support strong composition. The code and data are open-sourced at https://github.com/nsai-uio/HolmE-KGE.
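To make the setting concrete, the following is a minimal sketch of distance-based KGE scoring on the Poincaré ball, in the style of earlier hyperbolic methods such as AttH rather than the HolmE model itself; the function names, the curvature parameter `c`, and the toy vectors are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def mobius_add(x, y, c=1.0):
    """Möbius addition of two points on the Poincaré ball of curvature -c."""
    xy, x2, y2 = np.dot(x, y), np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c**2 * x2 * y2
    return num / den

def poincare_dist(x, y, c=1.0):
    """Hyperbolic distance between two points on the Poincaré ball."""
    diff = mobius_add(-x, y, c)
    return (2.0 / np.sqrt(c)) * np.arctanh(np.sqrt(c) * np.linalg.norm(diff))

def score(head, rel, tail, c=1.0):
    """Score a triple (h, r, t): translate the head by the relation vector,
    then measure hyperbolic distance to the tail. Less negative = more plausible."""
    return -poincare_dist(mobius_add(head, rel, c), tail, c)
```

In this family of models, training pushes the translated head of a true triple close to its tail, so the score of a held-out true triple should exceed that of a corrupted one.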
References
Balažević, I., Allen, C., Hospedales, T.: Multi-relational Poincaré graph embeddings. Adv. Neural Inf. Process. Syst. 32 (2019)
Balažević, I., Allen, C., Hospedales, T.: TuckER: tensor factorization for knowledge graph completion. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pp. 5185–5194 (2019)
Bordes, A., Usunier, N., Garcia-Duran, A., Weston, J., Yakhnenko, O.: Translating embeddings for modeling multi-relational data. Adv. Neural Inf. Process. Syst. 26 (2013)
Cao, E., Wang, D., Huang, J., Hu, W.: Open knowledge enrichment for long-tail entities. In: Proceedings of The Web Conference 2020, pp. 384–394 (2020)
Cao, Z., Xu, Q., Yang, Z., Cao, X., Huang, Q.: Geometry interaction knowledge graph embeddings. In: AAAI Conference on Artificial Intelligence (2022)
Chami, I., Wolf, A., Juan, D.C., Sala, F., Ravi, S., Ré, C.: Low-dimensional hyperbolic knowledge graph embeddings. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 6901–6914 (2020)
Chami, I., Ying, Z., Ré, C., Leskovec, J.: Hyperbolic graph convolutional neural networks. Adv. Neural Inf. Process. Syst. 32 (2019)
Chu, P., Bian, X., Liu, S., Ling, H.: Feature space augmentation for long-tailed data. In: Vedaldi, A., Bischof, H., Brox, T., Frahm, J.-M. (eds.) ECCV 2020. LNCS, vol. 12374, pp. 694–710. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-58526-6_41
Dai, E., et al.: A comprehensive survey on trustworthy graph neural networks: privacy, robustness, fairness, and explainability. arXiv preprint arXiv:2204.08570 (2022)
Nguyen, D.Q., Nguyen, T.D., Nguyen, D.Q., Phung, D.: A novel embedding model for knowledge base completion based on convolutional neural network. In: Proceedings of NAACL-HLT, pp. 327–333 (2018)
Dettmers, T., Minervini, P., Stenetorp, P., Riedel, S.: Convolutional 2D knowledge graph embeddings. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018)
Djeddi, W.E., Hermi, K., Ben Yahia, S., Diallo, G.: Advancing drug-target interaction prediction: a comprehensive graph-based approach integrating knowledge graph embedding and ProtBERT pretraining. BMC Bioinformatics 24(1), 488 (2023)
Dong, Y., Ma, J., Wang, S., Chen, C., Li, J.: Fairness in graph mining: a survey. IEEE Transactions on Knowledge and Data Engineering (2023)
Ebisu, T., Ichise, R.: TorusE: knowledge graph embedding on a Lie group. In: Thirty-Second AAAI Conference on Artificial Intelligence (2018)
Ganea, O., Bécigneul, G., Hofmann, T.: Hyperbolic neural networks. Adv. Neural Inf. Process. Syst. 31 (2018)
Islam, M.K., Amaya-Ramirez, D., Maigret, B., Devignes, M.D., Aridhi, S., Smaïl-Tabbone, M.: Molecular-evaluated and explainable drug repurposing for COVID-19 using ensemble knowledge graph embedding. Sci. Rep. 13(1), 3643 (2023)
Ji, G., He, S., Xu, L., Liu, K., Zhao, J.: Knowledge graph embedding via dynamic mapping matrix. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (volume 1: Long papers), pp. 687–696 (2015)
Lacroix, T., Usunier, N., Obozinski, G.: Canonical tensor decomposition for knowledge base completion. In: International Conference on Machine Learning, pp. 2863–2872. PMLR (2018)
Li, M., Sun, Z., Zhang, S., Zhang, W.: Enhancing knowledge graph embedding with relational constraints. Neurocomputing 429, 77–88 (2021)
Li, R., et al.: HousE: knowledge graph embedding with householder parameterization. arXiv preprint arXiv:2202.07919 (2022)
Lin, Y., Liu, Z., Sun, M., Liu, Y., Zhu, X.: Learning entity and relation embeddings for knowledge graph completion. In: Twenty-Ninth AAAI Conference on Artificial Intelligence (2015)
Liu, X., Zhao, F., Gui, X., Jin, H.: LeKAN: extracting long-tail relations via layer-enhanced knowledge-aggregation networks. In: Bhattacharya, A., et al. (eds.) Database Systems for Advanced Applications. DASFAA 2022. LNCS, vol. 13245, pp. 122–136. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-00123-9_9
Mahdisoltani, F., Biega, J., Suchanek, F.: YAGO3: a knowledge base from multilingual Wikipedias. In: 7th Biennial Conference on Innovative Data Systems Research. CIDR Conference (2014)
Pavlović, A., Sallinger, E.: ExpressivE: a spatio-functional embedding for knowledge graph completion. arXiv preprint arXiv:2206.04192 (2022)
Sun, Z., Deng, Z.H., Nie, J.Y., Tang, J.: RotatE: knowledge graph embedding by relational rotation in complex space. In: International Conference on Learning Representations (2018)
Toutanova, K., Chen, D.: Observed versus latent features for knowledge base and text inference. In: Proceedings of the 3rd Workshop on Continuous Vector Space Models and their Compositionality, pp. 57–66 (2015)
Trouillon, T., Welbl, J., Riedel, S., Gaussier, É., Bouchard, G.: Complex embeddings for simple link prediction. In: International Conference on Machine Learning, pp. 2071–2080. PMLR (2016)
Vashishth, S., Sanyal, S., Nitin, V., Talukdar, P.: Composition-based multi-relational graph convolutional networks. arXiv preprint arXiv:1911.03082 (2019)
Wang, S., Huang, X., Chen, C., Wu, L., Li, J.: Reform: error-aware few-shot knowledge graph completion. In: Proceedings of the 30th ACM International Conference on Information & Knowledge Management, pp. 1979–1988 (2021)
Wang, Z., Zhang, J., Feng, J., Chen, Z.: Knowledge graph embedding by translating on hyperplanes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 28 (2014)
Werner, S., Rettinger, A., Halilaj, L., Lüttin, J.: RETRA: recurrent transformers for learning temporally contextualized knowledge graph embeddings. In: Verborgh, R., et al. (eds.) ESWC 2021. LNCS, vol. 12731, pp. 425–440. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-77385-4_25
Willmore, T.J.: An Introduction to Differential Geometry. Courier Corporation, Honolulu (2013)
Xu, Y.W., Zhang, H.J., Cheng, K., Liao, X.L., Zhang, Z.X., Li, Y.B.: Knowledge graph embedding with entity attributes using hypergraph neural networks. Intell. Data Anal. 26(4), 959–975 (2022)
Yang, B., Yih, W.t., He, X., Gao, J., Deng, L.: Embedding entities and relations for learning and inference in knowledge bases. In: Proceedings of the International Conference on Learning Representations (ICLR) 2015 (2015)
Zhang, C., Yao, H., Huang, C., Jiang, M., Li, Z., Chawla, N.V.: Few-shot knowledge graph completion. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3041–3048 (2020)
A Result Details
Correlation between Entity Frequency and Bias Term. Fig. 8 shows that, across data sets and models, there is a strong correlation between the bias term and entity frequency.
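The kind of correlation reported in Fig. 8 can be measured as sketched below; this is an illustrative assumption about the analysis, with `freqs` standing for entity occurrence counts in the training triples and `biases` for the learned per-entity bias values (the log-transform of the long-tailed counts is our choice, not necessarily the paper's).

```python
import numpy as np

def entity_bias_correlation(freqs, biases):
    """Pearson correlation between (log-transformed) entity frequencies
    and the learned per-entity bias terms of a KGE model."""
    # log1p tames the heavy long tail of entity frequencies in KGs.
    return np.corrcoef(np.log1p(freqs), biases)[0, 1]
```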
Influence of Entity and Relation Frequency Threshold. Fig. 9 shows that the choice of entity and relation frequency threshold does not affect the validity of the observations and claims; the results are obtained on FB15k-237. (Fig. 9a) Models without bias (HolmE, AttH-ub) outperform their counterparts with bias (HolmE-b, AttH), and HolmE outperforms other models on triples with under-represented entities. (Fig. 9b) The majority of entities are under-represented unless the threshold is chosen to be extremely small, given that the mean entity frequency is 37.5 and the maximum is 7614. (Fig. 9c) HolmE outperforms other models on triples with under-represented relations. (Fig. 9d) The majority of relations are under-represented unless the threshold is chosen to be extremely small, given that the mean relation frequency is 1148.2 and the maximum is 15989.
Link Prediction Results in High Dimension. As expected, embeddings in different spaces achieve similar results in high-dimensional space (Table 6), because both Euclidean and hyperbolic spaces become expressive enough to represent the complex hierarchies in KGs [6]. As in the low-dimensional setting, HolmE (without bias) is slightly worse than the models with bias (GIE, HolmE-b), even though HolmE forgoes the prior probabilities provided by the bias term (Tables 7 and 8).
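The role of the bias term as a learned prior can be illustrated with a hypothetical sketch (the squared-distance form and the function name are our assumptions, in the style of biased hyperbolic models such as AttH): with biases, a frequent entity can outrank an equidistant rare one; dropping the biases, as HolmE does, removes that prior.

```python
def score_with_bias(dist, b_head, b_tail):
    """Distance-based triple score augmented with per-entity biases.
    The biases act as learned prior log-odds: frequent entities tend to
    receive larger biases and are boosted regardless of the query."""
    return -dist**2 + b_head + b_tail
```

Two candidate tails at the same distance tie only when their biases are equal, which is why removing the bias can help under-represented entities.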
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Zheng, Z. et al. (2024). Low-Dimensional Hyperbolic Knowledge Graph Embedding for Better Extrapolation to Under-Represented Data. In: Meroño Peñuela, A., et al. The Semantic Web. ESWC 2024. Lecture Notes in Computer Science, vol 14664. Springer, Cham. https://doi.org/10.1007/978-3-031-60626-7_6