Abstract
In recent years, researchers have shown increasing interest in joint entity and relation extraction. However, existing approaches overlook interactions between words at different distances and the significance of the contextual information between entities. We argue that the correlation strength of word pairs should be modeled explicitly, and that contextual information should be integrated into entities to learn better entity-level representations. In this paper, we treat named entity recognition as a multi-class classification problem over word pairs. We employ a self-attention mechanism and design both local and multi-grained dilated convolution layers to capture spatial correlations between words. In the relation extraction module, we leverage attention weights from the self-attention layer to fuse localized context information into entity pairs, producing context-enhanced entity-level representations. In addition, we integrate named entity recognition and relation extraction through a multi-task learning framework, effectively exploiting the interaction between the two subtasks. To validate our model, we conduct extensive experiments on the joint entity and relation extraction benchmark datasets CoNLL04, ADE, and SciERC. The experimental results indicate that our proposed model achieves significant improvements over existing methods on these datasets.
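To make the word-pair formulation concrete, the following is a minimal PyTorch sketch of a multi-grained dilated convolution layer over a word-pair grid, in the spirit of the layer the abstract describes. The class name MultiGrainedDilatedConv, the hidden size, and the dilation rates (1, 2, 3) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class MultiGrainedDilatedConv(nn.Module):
    """Sketch of multi-grained dilated 2D convolutions over a word-pair grid."""

    def __init__(self, d: int, dilations=(1, 2, 3)):  # dilation rates assumed
        super().__init__()
        # One 3x3 convolution per dilation rate; larger rates widen the
        # receptive field so that word pairs at greater distances interact.
        # Setting padding = dilation keeps the n x n grid size unchanged.
        self.convs = nn.ModuleList(
            nn.Conv2d(d, d, kernel_size=3, dilation=r, padding=r)
            for r in dilations
        )
        self.act = nn.GELU()

    def forward(self, pair_grid: torch.Tensor) -> torch.Tensor:
        # pair_grid: (batch, d, n, n), one feature vector per word pair (i, j).
        # Concatenate the outputs of all dilation rates along the channel axis,
        # giving each pair features at several granularities of distance.
        return torch.cat([self.act(conv(pair_grid)) for conv in self.convs], dim=1)

# Usage: score each word pair (i, j) from the concatenated features, e.g. with
# a linear classifier over 3 * d channels (one logit per entity/relation label).
grid = torch.randn(2, 128, 40, 40)             # 2 sentences, d = 128, n = 40
features = MultiGrainedDilatedConv(128)(grid)  # -> (2, 384, 40, 40)
```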
Acknowledgement
We thank the anonymous reviewers for their helpful comments and feedback. This work is supported by the National Natural Science Foundation of China (62162060).