
L-RCap: RNN-capsule model via label semantics for MLTC

Applied Intelligence

Abstract

Multi-label text classification (MLTC) assigns a set of labels to a single document, a setting that reflects real-world applications more closely than single-label classification. Models based on traditional deep learning achieve good prediction results, but they generally ignore label semantics and do not fit the connection between categories and text features well. This paper therefore proposes a novel model, L-RCap. L-RCap uses a Bi-LSTM to extract global text features, from which a label semantic attention mechanism constructs label-text features using the semantics of the labels. A capsule network then enriches the feature information, and its dynamic routing algorithm fits the association between features and categories. Compared with baseline models, our model exhibits the best performance on two datasets.
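
The pipeline above has three stages: a Bi-LSTM encoder, label-semantic attention, and capsule-style dynamic routing. The paper's implementation is not reproduced here, so the following PyTorch sketch is only one plausible reading of that pipeline: the embedding sizes, the dot-product form of the attention, and the use of one capsule per category are our assumptions, while the squash nonlinearity and the routing-by-agreement loop follow the original capsule-network formulation of Sabour et al. (2017).

```python
# A hedged sketch of an L-RCap-style pipeline, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


def squash(s, dim=-1, eps=1e-8):
    """Capsule 'squash': preserves direction, maps vector length into [0, 1)."""
    sq = (s ** 2).sum(dim=dim, keepdim=True)
    return (sq / (1.0 + sq)) * s / torch.sqrt(sq + eps)


class LRCapSketch(nn.Module):
    """Bi-LSTM encoder -> label-semantic attention -> dynamic routing.

    All sizes are illustrative; `label_emb` plays the role of the
    label semantics that drive the attention step.
    """

    def __init__(self, vocab_size, num_labels, emb_dim=128, hid=128,
                 caps_dim=16, routing_iters=3):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.label_emb = nn.Embedding(num_labels, 2 * hid)   # label semantics
        self.encoder = nn.LSTM(emb_dim, hid, batch_first=True,
                               bidirectional=True)
        # W[i, j] maps label-text feature i into the space of category capsule j.
        self.W = nn.Parameter(
            0.01 * torch.randn(num_labels, num_labels, 2 * hid, caps_dim))
        self.routing_iters = routing_iters

    def forward(self, tokens):                       # tokens: (B, T) int ids
        # 1) Global text features from the Bi-LSTM: (B, T, 2*hid).
        H, _ = self.encoder(self.word_emb(tokens))
        # 2) Label-semantic attention: each label attends over all tokens,
        #    yielding one label-text feature per label: (B, L, 2*hid).
        scores = torch.einsum('btd,ld->btl', H, self.label_emb.weight)
        attn = F.softmax(scores, dim=1)
        label_text = torch.einsum('btl,btd->bld', attn, H)
        # 3) Prediction vectors u_hat for every (input, category) pair.
        u_hat = torch.einsum('bid,iodc->bioc', label_text, self.W)
        # 4) Routing by agreement (Sabour et al., 2017). Gradients flow only
        #    through the final iteration, a common implementation choice.
        b = torch.zeros(u_hat.shape[:3], device=u_hat.device)  # (B, L, L)
        u_hat_det = u_hat.detach()
        for it in range(self.routing_iters):
            c = F.softmax(b, dim=2)                  # coupling coefficients
            src = u_hat if it == self.routing_iters - 1 else u_hat_det
            v = squash(torch.einsum('bio,bioc->boc', c, src))
            if it < self.routing_iters - 1:
                b = b + torch.einsum('bioc,boc->bio', u_hat_det, v)
        # Capsule length = per-category presence score in [0, 1).
        return v.norm(dim=-1)                        # (B, num_labels)


# Usage: RCV1-V2 (one of the paper's datasets) has 103 topic categories.
model = LRCapSketch(vocab_size=30000, num_labels=103)
scores = model(torch.randint(0, 30000, (4, 50)))     # -> shape (4, 103)
```

Because the length of each output capsule is read as the probability that the corresponding category is present, multi-label predictions come from thresholding the returned scores (e.g. at 0.5) rather than taking an argmax over categories.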




Acknowledgements

This work is supported by the Hebei Provincial Department of Education's 2021 Provincial Postgraduate Demonstration Course Project Construction under Grant KCJSX2021024.

Author information

Correspondence to Xiuling Zhang.


About this article


Cite this article

Zhang, X., Luo, Z., Du, B. et al. L-RCap: RNN-capsule model via label semantics for MLTC. Appl Intell 53, 14961–14970 (2023). https://doi.org/10.1007/s10489-022-04286-6
