
Automatic Text Summarization Using Sequence-to-Sequence Model and Recurrent Neural Network

Pooja Gupta, Swati Nigam, Rajiv Singh

Abstract


Presently, the Internet serves as a repository for a wide variety of content, including scholarly articles, news, blog posts, and social media updates. This electronic data contains a vast amount of information that must be managed, organized, and stored. Text summarization is the process of reducing large volumes of textual data to a more manageable form. Many approaches exist for summarizing English-language text, but only a few are applied to low-resource languages such as Hindi. For text summarization in English and Hindi, we implement a sequence-to-sequence (Seq2Seq) encoder-decoder model with a recurrent neural network (RNN) and present two methods: an extractive method and an abstractive method. With the extractive method, the proposed approach selects the most important sentences from the input documents. The abstractive method uses natural language generation to produce a summary that preserves the meaning of the original content. The main objective of this research is to generate abstractive and extractive summaries in both Hindi and English for the same datasets. We build the summaries on four datasets: the CNN News and BBC News datasets, which are used for the English-Hindi parallel corpus, and, for Hindi summaries only, the Indian language text corpus and the Hindi text summarization corpus. We evaluate the methods with the ROUGE metric, reporting F-measure, precision, and recall, and compare the experimental results with existing state-of-the-art methods.
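
The full text describes the model in detail; the following is a minimal illustrative sketch, not the authors' implementation, of a Seq2Seq encoder-decoder built from LSTM units of the kind commonly used for abstractive summarization. The vocabulary sizes and layer dimensions are placeholder assumptions.

    # Minimal Seq2Seq encoder-decoder sketch (illustrative only, not the paper's code).
    import tensorflow as tf
    from tensorflow.keras import layers, Model

    VOCAB_SRC, VOCAB_TGT = 30000, 30000   # assumed vocabulary sizes
    EMB_DIM, HID_DIM = 128, 256           # assumed embedding / hidden sizes

    # Encoder: reads the source article and returns its final LSTM states.
    enc_inputs = layers.Input(shape=(None,), name="article_tokens")
    enc_emb = layers.Embedding(VOCAB_SRC, EMB_DIM, mask_zero=True)(enc_inputs)
    _, state_h, state_c = layers.LSTM(HID_DIM, return_state=True)(enc_emb)

    # Decoder: generates the summary token by token, initialized with the
    # encoder's final states.
    dec_inputs = layers.Input(shape=(None,), name="summary_tokens")
    dec_emb = layers.Embedding(VOCAB_TGT, EMB_DIM, mask_zero=True)(dec_inputs)
    dec_out, _, _ = layers.LSTM(HID_DIM, return_sequences=True,
                                return_state=True)(dec_emb,
                                                   initial_state=[state_h, state_c])
    dec_logits = layers.Dense(VOCAB_TGT, activation="softmax")(dec_out)

    model = Model([enc_inputs, dec_inputs], dec_logits)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    model.summary()

For evaluation, ROUGE-N recall is the fraction of reference n-grams that also appear in the generated summary, precision is the fraction of generated n-grams that appear in the reference, and the F-measure is their harmonic mean.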

Keywords


Abstractive summarization, extractive summarization, deep learning, Seq2Seq, recurrent neural network, ROUGE.
