A Short Text Classification Method Based on Combining Label Information and Self-attention Graph Convolutional Neural Network

  • Conference paper
Computer Supported Cooperative Work and Social Computing (ChineseCSCW 2020)

Part of the book series: Communications in Computer and Information Science ((CCIS,volume 1330))

Abstract

Deep learning classification models have achieved good results on many text classification tasks. However, these methods generally consider only a single text feature, which makes short texts difficult to represent. To this end, this paper proposes a text classification method that combines label information with a self-attention graph convolutional neural network. The method introduces an attention mechanism based on label information to obtain a more accurate text representation, and it extracts global features across multiple texts with a self-attention graph convolutional network for classification. Experimental results on the R8 and MR datasets show that, compared with the latest models, our model increases F1 and accuracy by 2.58% and 2.02% on MR, and by 3.52% and 2.52% on R8.
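The two mechanisms named in the abstract, a label-informed attention that pools word embeddings into a text representation and a graph convolution whose propagation weights come from self-attention over node features, can be sketched roughly as follows. This is a minimal illustration under assumed shapes, not the authors' implementation: the function names `label_attention_pool` and `self_attention_gcn_layer` are hypothetical, and choices such as cosine similarity, max-over-labels pooling, and dot-product attention are plausible guesses rather than details taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def label_attention_pool(words, labels):
    """Label-informed attention: score each word against every class
    embedding, take the best-matching class per word, and use the
    softmax of those scores as pooling weights over the words.

    words:  (n, d) word embeddings of one short text
    labels: (c, d) class/label embeddings
    returns (d,) attended text representation
    """
    # cosine similarity between every word and every label
    wn = words / np.linalg.norm(words, axis=1, keepdims=True)
    ln = labels / np.linalg.norm(labels, axis=1, keepdims=True)
    scores = wn @ ln.T                   # (n, c) word-label affinities
    beta = softmax(scores.max(axis=1))   # (n,) per-word attention weight
    return beta @ words                  # weighted sum -> (d,)

def self_attention_gcn_layer(H, A, W):
    """One graph-convolution step whose edge weights are rescaled by
    dot-product self-attention over node features, masked to the graph.

    H: (N, d) node (text) features
    A: (N, N) 0/1 adjacency over the text graph
    W: (d, d_out) learnable projection
    """
    att = softmax(np.where(A > 0, H @ H.T, -1e9), axis=1)  # masked attention
    return np.maximum(att @ H @ W, 0.0)                    # ReLU(Â H W)
```

In a full pipeline, each short text would first be pooled into a node feature by the label attention, the nodes connected into a corpus-level graph, and one or more self-attention graph-convolution layers applied before a softmax classifier.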


Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant 61966020.

Copyright information

© 2021 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Wang, H., Luo, G., Li, R. (2021). A Short Text Classification Method Based on Combining Label Information and Self-attention Graph Convolutional Neural Network. In: Sun, Y., Liu, D., Liao, H., Fan, H., Gao, L. (eds) Computer Supported Cooperative Work and Social Computing. ChineseCSCW 2020. Communications in Computer and Information Science, vol 1330. Springer, Singapore. https://doi.org/10.1007/978-981-16-2540-4_50

  • DOI: https://doi.org/10.1007/978-981-16-2540-4_50

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-16-2539-8

  • Online ISBN: 978-981-16-2540-4

  • eBook Packages: Computer Science, Computer Science (R0)
