Abstract
Relation classification is an important task in natural language processing (NLP). State-of-the-art methods are mainly based on deep neural networks. This paper proposes BiTCNN, a bi-channel tree-convolution-based neural network model that combines syntactic tree features with lexical-level features in a deep architecture for relation classification. First, each input sentence is parsed into a syntactic tree. This tree is then decomposed into two sub-tree sequences using a top-down decomposition strategy and a bottom-up decomposition strategy. Each sub-tree represents a semantic fragment of the input sentence and is converted into a real-valued vector. These vectors are fed into a bi-channel convolutional neural network, and convolution operations are performed on them. Finally, the outputs of the two convolution channels are combined and passed through a series of linear transformations to obtain the final relation classification result. Our method integrates syntactic tree features with a convolutional neural network architecture to exploit the advantages of both. The proposed method is evaluated on the SemEval 2010 data set, and extensive experiments show that it achieves better relation classification results than other state-of-the-art methods.
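The abstract only outlines the pipeline, so as a rough illustration the following is a minimal PyTorch sketch of the bi-channel convolution over the two sub-tree embedding sequences. The embedding dimension, filter count, kernel size, pooling choice, classifier depth, and the 19-class label set (SemEval 2010 Task 8) are assumptions for illustration, not values taken from the paper.

    # Minimal sketch (not the authors' code): a bi-channel CNN over two
    # sub-tree embedding sequences, assuming sub-trees have already been
    # parsed, decomposed (top-down / bottom-up), and embedded into vectors.
    import torch
    import torch.nn as nn

    class BiTCNNSketch(nn.Module):
        def __init__(self, emb_dim=100, num_filters=150, kernel_size=3, num_classes=19):
            super().__init__()
            # One convolution channel per sub-tree sequence.
            self.conv_td = nn.Conv1d(emb_dim, num_filters, kernel_size, padding=1)
            self.conv_bu = nn.Conv1d(emb_dim, num_filters, kernel_size, padding=1)
            # The "series of linear transformation operations" from the
            # abstract, approximated here by a two-layer classifier.
            self.classifier = nn.Sequential(
                nn.Linear(2 * num_filters, num_filters),
                nn.Tanh(),
                nn.Linear(num_filters, num_classes),
            )

        def forward(self, topdown_seq, bottomup_seq):
            # Inputs: (batch, seq_len, emb_dim); Conv1d expects channels first.
            td = torch.relu(self.conv_td(topdown_seq.transpose(1, 2)))
            bu = torch.relu(self.conv_bu(bottomup_seq.transpose(1, 2)))
            # Max-pool over the sub-tree sequence dimension in each channel.
            td = td.max(dim=2).values
            bu = bu.max(dim=2).values
            # Combine the two channels and classify the relation.
            return self.classifier(torch.cat([td, bu], dim=1))

    model = BiTCNNSketch()
    topdown = torch.randn(4, 20, 100)   # 4 sentences, 20 sub-tree vectors each
    bottomup = torch.randn(4, 20, 100)
    logits = model(topdown, bottomup)   # shape: (4, num_classes)

In this sketch each channel convolves one sub-tree sequence, max-pooling collapses each channel to a fixed-size vector, and the concatenated channels feed the linear classifier, mirroring the combine-then-transform step described in the abstract.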
Acknowledgements
This work was supported by the National Natural Science Foundation of China (NSFC Nos. 61572120, 61672138, and 61432013).
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Cite this paper
Ren, F., Li, Y., Zhao, R., Zhou, D., Liu, Z. (2018). BiTCNN: A Bi-Channel Tree Convolution Based Neural Network Model for Relation Classification. In: Zhang, M., Ng, V., Zhao, D., Li, S., Zan, H. (eds) Natural Language Processing and Chinese Computing. NLPCC 2018. Lecture Notes in Computer Science(), vol 11108. Springer, Cham. https://doi.org/10.1007/978-3-319-99495-6_14
DOI: https://doi.org/10.1007/978-3-319-99495-6_14
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-99494-9
Online ISBN: 978-3-319-99495-6