Joint Modeling of Local and Global Semantics for Contrastive Entity Disambiguation | SpringerLink

Joint Modeling of Local and Global Semantics for Contrastive Entity Disambiguation

  • Conference paper
Advanced Data Mining and Applications (ADMA 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14176)


Abstract

Entity disambiguation (ED) is a critical natural language processing (NLP) task that involves identifying entity mentions in text and linking them to their corresponding real-world entities in a reference knowledge graph (KG). Most existing efforts perform ED by first learning representations of mentions and candidate entities from a variety of features, and then assessing both the compatibility between a mention and its candidate entities and the coherence among entities. Despite advancements in the field, the limited textual descriptions of mentions and entities still lead to semantic ambiguity, resulting in sub-optimal performance on the entity disambiguation task. In this work, we propose LogicED, a novel framework that considers both Local and global semantics for contrastive Entity Disambiguation. Specifically, we design a local contextual module that uses a candidate-aware self-attention (CASA) model together with a contrastive learning strategy to learn robust, discriminative contextual embeddings for both mentions and candidate entities. Furthermore, we propose a global semantic graph module that accounts for both local mention-entity compatibility and global entity-entity coherence to optimize entity disambiguation from a global perspective. Extensive experiments on benchmark datasets demonstrate that our proposed framework surpasses state-of-the-art baselines.
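The contrastive strategy described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it shows one common way such an objective is set up, an InfoNCE-style loss that pulls a mention embedding toward its gold entity and pushes it away from the other candidates. All names here (`info_nce_loss`, `temperature`, the embedding dimension) are illustrative assumptions.

```python
# Illustrative sketch (not the paper's code): an InfoNCE-style contrastive
# objective over mention and candidate-entity embeddings.
import numpy as np

def info_nce_loss(mention, candidates, gold_idx, temperature=0.07):
    """Contrastive loss: the gold entity is the positive, the remaining
    candidates are negatives."""
    # Cosine similarity between the mention and each candidate entity.
    m = mention / np.linalg.norm(mention)
    c = candidates / np.linalg.norm(candidates, axis=1, keepdims=True)
    sims = c @ m / temperature
    # Softmax cross-entropy with the gold entity as the target class.
    logits = sims - sims.max()          # shift for numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[gold_idx])

# Toy embeddings: candidate 2 is made similar to the mention, so treating
# it as the gold entity should yield a small loss.
rng = np.random.default_rng(0)
mention = rng.normal(size=64)
candidates = rng.normal(size=(5, 64))
candidates[2] = mention + 0.1 * rng.normal(size=64)
loss = info_nce_loss(mention, candidates, gold_idx=2)
```

In a full system the embeddings would come from the contextual encoder rather than random vectors, and the local scores would then feed the global module's joint optimization over mention-entity compatibility and entity-entity coherence.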


Notes

  1. https://github.com/wikimedia/mediawiki.



Acknowledgement

This work was supported in part by the Guangdong Provincial Key Laboratory of Interdisciplinary Research and Application for Data Science, BNU-HKBU United International College (2022B1212010006) and in part by Guangdong Higher Education Upgrading Plan (2021–2025) (UICR0400001-22, UICR0400017-21, UICR0400003-21).

Author information


Corresponding author

Correspondence to Rui Meng.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Ke, Y., Xue, S., Chen, Z., Meng, R. (2023). Joint Modeling of Local and Global Semantics for Contrastive Entity Disambiguation. In: Yang, X., et al. (eds.) Advanced Data Mining and Applications. ADMA 2023. Lecture Notes in Computer Science, vol 14176. Springer, Cham. https://doi.org/10.1007/978-3-031-46661-8_17

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-46661-8_17

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-46660-1

  • Online ISBN: 978-3-031-46661-8

  • eBook Packages: Computer Science (R0)
