Abstract
Deep learning models have achieved good results on many text classification tasks. However, these methods generally consider only a single text feature, which makes short texts difficult to represent. To this end, this paper proposes a text classification method that combines label information with a self-attention graph convolutional neural network. The method introduces a label-informed attention mechanism to obtain a more accurate text representation, and it extracts global features across multiple texts with a self-attention graph convolutional network for classification. Experimental results on the R8 and MR datasets show that, compared with the latest models, our model improves F1 and accuracy by 2.58% and 2.02% on MR, and by 3.52% and 2.52% on R8.
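The label-informed attention described above can be sketched roughly as follows: score each word of a short text by its similarity to learned label embeddings, then pool the word vectors with those scores. This is a minimal illustration in the spirit of label-embedding attention (cf. Wang et al., 2018), not the authors' exact formulation; the function name, shapes, and use of cosine similarity are assumptions.

```python
import numpy as np

def label_attention_repr(word_embs, label_embs):
    """Illustrative label-guided attention pooling (a sketch, not the paper's exact method).

    word_embs:  (seq_len, d) word vectors for one short text
    label_embs: (num_labels, d) one embedding per class label
    Returns a single (d,) attention-weighted text representation.
    """
    # Cosine similarity between every word and every label
    w = word_embs / (np.linalg.norm(word_embs, axis=1, keepdims=True) + 1e-8)
    lbl = label_embs / (np.linalg.norm(label_embs, axis=1, keepdims=True) + 1e-8)
    sim = w @ lbl.T                          # (seq_len, num_labels)
    scores = sim.max(axis=1)                 # strongest label affinity per word
    attn = np.exp(scores - scores.max())     # numerically stable softmax over positions
    attn /= attn.sum()
    return attn @ word_embs                  # weighted average of word vectors
```

Words that align closely with any label embedding receive higher weights, so the pooled vector emphasizes class-discriminative terms, which is the intuition behind using label information for short-text representation.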
Acknowledgments
This work was supported by the National Natural Science Foundation of China under Grant 61966020.
Copyright information
© 2021 Springer Nature Singapore Pte Ltd.
Cite this paper
Wang, H., Luo, G., Li, R. (2021). A Short Text Classification Method Based on Combining Label Information and Self-attention Graph Convolutional Neural Network. In: Sun, Y., Liu, D., Liao, H., Fan, H., Gao, L. (eds) Computer Supported Cooperative Work and Social Computing. ChineseCSCW 2020. Communications in Computer and Information Science, vol 1330. Springer, Singapore. https://doi.org/10.1007/978-981-16-2540-4_50
DOI: https://doi.org/10.1007/978-981-16-2540-4_50
Publisher Name: Springer, Singapore
Print ISBN: 978-981-16-2539-8
Online ISBN: 978-981-16-2540-4