Abstract
Temporal-relation classification plays an important role in natural language processing. Various deep learning-based classifiers that build models from sentence embeddings have been proposed for this challenging task. These approaches, however, often underperform because sentence embeddings alone lack task-related information. To overcome this problem, we propose a novel framework that incorporates prior information by employing awareness of events and time expressions (time–event entities) as a filter. We name this module the "question encoder." In our approach, this prior information extracts task-related features from the sentence embedding. Our experimental results on the publicly available TimeBank-Dense corpus demonstrate that our approach outperforms several state-of-the-art techniques.
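The paper's question encoder is not detailed in this abstract; the following is a minimal sketch of the general idea, assuming an attention-style filter in which a vector encoding time–event entities (the "question") re-weights token embeddings before pooling. The function and variable names (`question_filter`, `question_vector`) are illustrative, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def question_filter(token_embeddings, question_vector):
    """Weight each token embedding by its relevance to the
    time-event 'question' vector, then pool into a single
    task-aware sentence representation."""
    scores = token_embeddings @ question_vector   # relevance score per token, shape (T,)
    weights = softmax(scores)                     # attention distribution over tokens
    return weights @ token_embeddings             # filtered sentence embedding, shape (D,)

rng = np.random.default_rng(0)
tokens = rng.normal(size=(6, 4))   # 6 tokens, 4-dim embeddings
question = rng.normal(size=4)      # encodes event/time-expression awareness
sentence_vec = question_filter(tokens, question)
print(sentence_vec.shape)  # (4,)
```

Tokens near the annotated event or time expression receive higher attention weight, so the pooled vector emphasizes task-related information rather than the whole sentence uniformly.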
Acknowledgements
This work was partially supported by a JSPS Grant-in-Aid for Scientific Research (B) (#19H04420).
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Seki, Y., Zhao, K., Oguni, M., Sugiyama, K. (2020). A Framework for Classifying Temporal Relations with Question Encoder. In: Ishita, E., Pang, N.L.S., Zhou, L. (eds) Digital Libraries at Times of Massive Societal Transition. ICADL 2020. Lecture Notes in Computer Science(), vol 12504. Springer, Cham. https://doi.org/10.1007/978-3-030-64452-9_2
Print ISBN: 978-3-030-64451-2
Online ISBN: 978-3-030-64452-9