{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,5,10]],"date-time":"2024-05-10T16:54:36Z","timestamp":1715360076368},"reference-count":42,"publisher":"Association for Computing Machinery (ACM)","issue":"6",
"funder":[{"DOI":"10.13039\/100018883","name":"Thu Dau Mot University","doi-asserted-by":"crossref","id":[{"id":"10.13039\/100018883","id-type":"DOI","asserted-by":"crossref"}]}],
"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["ACM Trans. Asian Low-Resour. Lang. Inf. Process."],"published-print":{"date-parts":[[2021,11,30]]},
"abstract":"Recently, advanced deep learning techniques such as recurrent neural networks (GRU, LSTM and Bi-LSTM) and auto-encoding (attention-based transformer and BERT) have achieved great success in multiple application domains, including text summarization. Recent state-of-the-art encoding-based text summarization models such as BertSum, PreSum and DiscoBert have demonstrated significant improvements on extractive text summarization tasks. However, these models still encounter common problems related to language-specific dependency, which requires the support of external NLP tools. In addition, recent advanced text representation methods, such as BERT used as a sentence-level textual encoder, also fail to fully capture the representation of a full-length document. To address these challenges, in this paper we propose a novel semantic-aware embedding approach for extractive text summarization, called SE4ExSum. Our proposed SE4ExSum integrates a feature graph-of-words (FGOW) with a BERT-based encoder to effectively learn the word\/sentence-level representations of a given document. Then, a graph convolutional network (GCN) based encoder is applied to learn the global document representation, which is then used to facilitate the text summarization task. Extensive experiments on benchmark datasets show the effectiveness of our proposed model in comparison with recent state-of-the-art text summarization models.",
"DOI":"10.1145\/3464426","type":"journal-article","created":{"date-parts":[[2021,9,1]],"date-time":"2021-09-01T19:16:06Z","timestamp":1630523766000},"page":"1-22","update-policy":"http:\/\/dx.doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":5,
"title":["SE4ExSum: An Integrated Semantic-aware Neural Approach with Graph Convolutional Network for Extractive Text Summarization"],"prefix":"10.1145","volume":"20",
"author":[{"ORCID":"http:\/\/orcid.org\/0000-0001-7291-4168","authenticated-orcid":false,"given":"Tham","family":"Vo","sequence":"first","affiliation":[{"name":"Thu Dau Mot University, Binh Duong, Vietnam"}]}],"member":"320","published-online":{"date-parts":[[2021,9]]},
"reference":[
{"key":"e_1_2_1_1_1","doi-asserted-by":"publisher","DOI":"10.1007\/s10489-015-0747-x"},
{"key":"e_1_2_1_2_1","doi-asserted-by":"publisher","DOI":"10.1007\/s10489-015-0753-z"},
{"key":"e_1_2_1_3_1","first-page":"113679","article-title":"Automatic text summarization: A comprehensive survey","author":"El-Kassas W. S.","year":"2020","unstructured":"W. S. El-Kassas, C. R. Salama, A. A. Rafea, and H. K. Mohamed. 2020. Automatic text summarization: A comprehensive survey. Expert Systems with Applications 113679, 2020.","journal-title":"Expert Systems with Applications"},
{"key":"e_1_2_1_4_1","volume-title":"Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics","author":"Liu P. J.","year":"2017","unstructured":"A. See, P. J. Liu, and C. D. Manning. 2017. Get to the point: Summarization with pointer-generator networks. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, 2017."},
{"key":"e_1_2_1_5_1","doi-asserted-by":"publisher","DOI":"10.1007\/s10462-016-9475-9"},
{"key":"e_1_2_1_6_1","volume-title":"3rd International Conference on Learning Representations (ICLR)","author":"Bahdanau D.","year":"2015","unstructured":"D. Bahdanau, K. Cho, and Y. Bengio. 2015. Neural machine translation by jointly learning to align and translate. In 3rd International Conference on Learning Representations (ICLR), 2015."},
{"key":"e_1_2_1_7_1","doi-asserted-by":"publisher","DOI":"10.18653\/v1\/D18-1208"},
{"key":"e_1_2_1_8_1","volume-title":"Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics","author":"Cheng J.","year":"2016","unstructured":"J. Cheng and M. Lapata. 2016. Neural summarization by extracting sentences and words. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, 2016."},
{"key":"e_1_2_1_9_1","unstructured":"W. Xu, S. Li, and Y. Lu. 2020. Usr-mtl: An unsupervised sentence representation learning framework with multi-task learning. Applied Intelligence 1\u201316, 2020."},
{"key":"e_1_2_1_10_1","doi-asserted-by":"publisher","DOI":"10.1007\/s10489-020-01680-w"},
{"key":"e_1_2_1_11_1","volume-title":"Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies","author":"Peters M. E.","year":"2018","unstructured":"M. E. Peters, M. Neumann, M. Iyyer, M. Gardner, C. Clark, K. Lee, and L. Zettlemoyer. 2018. Deep contextualized word representations. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2018."},
{"key":"e_1_2_1_12_1","volume-title":"OpenAI","author":"Radford A.","year":"2018","unstructured":"A. Radford, K. Narasimhan, T. Salimans, and I. Sutskever. 2018. Improving language understanding by generative pre-training. OpenAI, 2018."},
{"key":"e_1_2_1_13_1","volume-title":"Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies","author":"Devlin J.","year":"2019","unstructured":"J. Devlin, M. W. Chang, K. Lee, and K. Toutanova. 2019. BERT: Pre-training of deep bidirectional transformers for language understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019."},
{"key":"e_1_2_1_14_1","volume-title":"Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)","author":"Liu Y.","year":"2019","unstructured":"Y. Liu and M. Lapata. 2019. Text summarization with pretrained encoders. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019."},
{"key":"e_1_2_1_15_1","doi-asserted-by":"publisher","DOI":"10.18653\/v1\/K19-1074"},
{"key":"e_1_2_1_16_1","doi-asserted-by":"publisher","DOI":"10.18653\/v1\/D18-1207"},
{"key":"e_1_2_1_17_1","doi-asserted-by":"publisher","DOI":"10.18653\/v1\/D19-1298"},
{"key":"e_1_2_1_18_1","doi-asserted-by":"publisher","DOI":"10.18653\/v1\/2020.acl-main.451"},
{"key":"e_1_2_1_19_1","volume-title":"5th International Conference on Learning Representations, ICLR","author":"Kipf T. N.","year":"2017","unstructured":"T. N. Kipf and M. Welling. 2017. Semi-supervised classification with graph convolutional networks. In 5th International Conference on Learning Representations, ICLR, 2017."},
{"key":"e_1_2_1_20_1","volume-title":"International Conference on Learning Representations","author":"Paulus R.","year":"2018","unstructured":"R. Paulus, C. Xiong, and R. Socher. 2018. A deep reinforced model for abstractive summarization. In International Conference on Learning Representations, 2018."},
{"key":"e_1_2_1_21_1","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2019.2941964"},
{"key":"e_1_2_1_22_1","doi-asserted-by":"publisher","DOI":"10.18653\/v1\/D18-1443"},
{"key":"e_1_2_1_23_1","volume-title":"Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing","author":"Li W.","year":"2018","unstructured":"W. Li, X. Xiao, Y. Lyu, and Y. Wang. 2018. Improving neural abstractive document summarization with explicit information selection modeling. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018."},
{"key":"e_1_2_1_24_1","volume-title":"Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies","author":"Pasunuru R.","year":"2018","unstructured":"R. Pasunuru and M. Bansal. 2018. Multi-reward reinforced summarization with saliency and entailment. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2018."},
{"key":"e_1_2_1_25_1","doi-asserted-by":"publisher","DOI":"10.18653\/v1\/D19-1320"},
{"key":"e_1_2_1_26_1","volume-title":"Proceedings of the AAAI Conference on Artificial Intelligence","author":"Liu L.","year":"2018","unstructured":"L. Liu, Y. Lu, M. Yang, Q. Qu, J. Zhu, and H. Li. 2018. Generative adversarial network for abstractive text summarization. In Proceedings of the AAAI Conference on Artificial Intelligence, 2018."},
{"key":"e_1_2_1_27_1","doi-asserted-by":"publisher","DOI":"10.1162\/dint_a_00014"},
{"key":"e_1_2_1_28_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.asoc.2017.04.069"},
{"key":"e_1_2_1_29_1","doi-asserted-by":"publisher","DOI":"10.5555\/3298483.3298681"},
{"key":"e_1_2_1_30_1","volume-title":"Proceedings of the AAAI Conference on Artificial Intelligence","author":"Zhang Z.","unstructured":"Z. Zhang, Y. Wu, H. Zhao, Z. Li, S. Zhang, X. Zhou, and X. Zhou. Semantics-aware BERT for language understanding. In Proceedings of the AAAI Conference on Artificial Intelligence."},
{"key":"e_1_2_1_31_1","doi-asserted-by":"publisher","DOI":"10.3115\/v1\/P14-5010"},
{"key":"e_1_2_1_32_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.knosys.2020.106649"},
{"key":"e_1_2_1_33_1","volume-title":"Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics","author":"Howard J.","year":"2018","unstructured":"J. Howard and S. Ruder. 2018. Universal language model fine-tuning for text classification. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, 2018."},
{"key":"e_1_2_1_34_1","doi-asserted-by":"publisher","DOI":"10.5555\/3295222.3295349"},
{"key":"e_1_2_1_35_1","volume-title":"1st International Conference on Learning Representations (ICLR)","author":"Mikolov T.","year":"2013","unstructured":"T. Mikolov, K. Chen, G. Corrado, and J. Dean. 2013. Efficient estimation of word representations in vector space. In 1st International Conference on Learning Representations (ICLR), 2013."},
{"key":"e_1_2_1_36_1","doi-asserted-by":"publisher","DOI":"10.3115\/v1\/D14-1162"},
{"key":"e_1_2_1_37_1","volume-title":"Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing","author":"Rousseau F.","year":"2015","unstructured":"F. Rousseau, E. Kiagias, and M. Vazirgiannis. 2015. Text categorization as a graph classification problem. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing, 2015."},
{"key":"e_1_2_1_38_1","volume-title":"Proceedings of the AAAI Conference on Artificial Intelligence","author":"Yao L.","year":"2019","unstructured":"L. Yao, C. Mao, and Y. Luo. 2019. Graph convolutional networks for text classification. In Proceedings of the AAAI Conference on Artificial Intelligence, 2019."},
{"key":"e_1_2_1_39_1","volume-title":"Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, EMNLP","author":"Lee K.","year":"2017","unstructured":"K. Lee, L. He, M. Lewis, and L. Zettlemoyer. 2017. End-to-end neural coreference resolution. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, EMNLP, 2017."},
{"key":"e_1_2_1_40_1","volume-title":"Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics","author":"Durrett G.","year":"2016","unstructured":"G. Durrett, T. Berg-Kirkpatrick, and D. Klein. 2016. Learning-based single-document summarization with compression and anaphoricity constraints. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, 2016."},
{"key":"e_1_2_1_41_1","volume-title":"Text Summarization Branches Out","author":"Lin C. Y.","year":"2004","unstructured":"C. Y. Lin. 2004. ROUGE: A package for automatic evaluation of summaries. In Text Summarization Branches Out, 2004."},
{"key":"e_1_2_1_42_1","volume-title":"Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics","author":"Wang W.","year":"2016","unstructured":"W. Wang and B. Chang. 2016. Graph-based dependency parsing with bidirectional LSTM. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, 2016."}],
"container-title":["ACM Transactions on Asian and Low-Resource Language Information Processing"],"original-title":[],"language":"en",
"link":[{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3464426","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],
"deposited":{"date-parts":[[2023,1,1]],"date-time":"2023-01-01T21:47:53Z","timestamp":1672609673000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3464426"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,9]]},"references-count":42,
"journal-issue":{"issue":"6","published-print":{"date-parts":[[2021,11,30]]}},"alternative-id":["10.1145\/3464426"],"URL":"https:\/\/doi.org\/10.1145\/3464426","relation":{},"ISSN":["2375-4699","2375-4702"],"issn-type":[{"value":"2375-4699","type":"print"},{"value":"2375-4702","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,9]]},
"assertion":[{"value":"2020-12-01","order":0,"name":"received","label":"Received","group":{"name":"publication_history","label":"Publication History"}},{"value":"2021-05-01","order":1,"name":"accepted","label":"Accepted","group":{"name":"publication_history","label":"Publication History"}},{"value":"2021-09-01","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}