Abstract
Entity disambiguation (ED) is a critical natural language processing (NLP) task that identifies entity mentions in text and links them to their corresponding real-world entities in a reference knowledge graph (KG). Most existing efforts perform ED by first learning representations of mentions and candidate entities from a variety of features, and then assessing the compatibility between a mention and its candidate entities as well as the coherence among entities. Despite advances in the field, the limited textual descriptions of mentions and entities still lead to semantic ambiguity, resulting in sub-optimal disambiguation performance. In this work, we propose LogicED, a novel framework that considers both Local and global semantics for contrastive Entity Disambiguation. Specifically, we design a local contextual module that uses a candidate-aware self-attention (CASA) model and a contrastive learning strategy to learn robust, discriminative contextual embeddings for both mentions and candidate entities. Furthermore, we propose a global semantic graph module that accounts for both local mention-entity compatibility and global entity-entity coherence, optimizing entity disambiguation from a global perspective. Extensive experiments on benchmark datasets demonstrate that our framework surpasses state-of-the-art baselines.
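The contrastive learning strategy described above can be illustrated with a minimal InfoNCE-style sketch: given a mention embedding and the embeddings of its candidate entities, the loss pulls the mention toward the gold entity and pushes it away from the other candidates. All names here (`info_nce_loss`, the temperature value) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def info_nce_loss(mention_emb, candidate_embs, gold_idx, temperature=0.1):
    """Contrastive (InfoNCE-style) loss over a mention's candidate set.

    mention_emb:    (d,) embedding of the mention in context
    candidate_embs: (k, d) embeddings of the k candidate entities
    gold_idx:       index of the gold (correct) entity among the candidates
    """
    # Cosine similarity between the mention and each candidate entity,
    # scaled by a temperature hyperparameter.
    m = mention_emb / np.linalg.norm(mention_emb)
    c = candidate_embs / np.linalg.norm(candidate_embs, axis=1, keepdims=True)
    sims = (c @ m) / temperature
    # Softmax over the candidates (shifted for numerical stability);
    # the loss is the negative log-probability assigned to the gold entity.
    sims -= sims.max()
    probs = np.exp(sims) / np.exp(sims).sum()
    return -np.log(probs[gold_idx])
```

A mention embedding that is close to its gold candidate and far from the distractors yields a near-zero loss, which is the behavior a discriminative contextual encoder is trained toward.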
Acknowledgement
This work was supported in part by the Guangdong Provincial Key Laboratory of Interdisciplinary Research and Application for Data Science, BNU-HKBU United International College (2022B1212010006) and in part by Guangdong Higher Education Upgrading Plan (2021–2025) (UICR0400001-22, UICR0400017-21, UICR0400003-21).
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Ke, Y., Xue, S., Chen, Z., Meng, R. (2023). Joint Modeling of Local and Global Semantics for Contrastive Entity Disambiguation. In: Yang, X., et al. Advanced Data Mining and Applications. ADMA 2023. Lecture Notes in Computer Science(), vol 14176. Springer, Cham. https://doi.org/10.1007/978-3-031-46661-8_17
DOI: https://doi.org/10.1007/978-3-031-46661-8_17
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-46660-1
Online ISBN: 978-3-031-46661-8
eBook Packages: Computer Science (R0)