Please use this identifier to cite or link to this item: http://10.1.7.192:80/jspui/handle/123456789/11862
Title: Abstractive Text Summarization
Authors: Godhani, Prashant
Keywords: Computer 2021
Project Report 2021
Computer Project Report
Project Report
21MCE
21MCEC
21MCEC01
Issue Date: 1-Jun-2023
Publisher: Institute of Technology
Series/Report no.: 21MCEC01;21MCEC01
Abstract: Every day, large volumes of information are released in many formats, including text, video, and images, and textual data in particular is becoming increasingly dense. Extracting only the important information from large documents is therefore necessary, and text summarization techniques were introduced to address this problem. Abstractive text summarization is a method for producing a summary that conveys the most significant information from the original document. In this research, we explore the effectiveness of two deep learning architectures for abstractive text summarization: a Seq2seq GRU model and a Transformer-based model using multi-head attention. We conduct our experiments on the CNN/DailyMail dataset, which contains articles paired with highlights, and compare the performance of both models using ROUGE scores. The Seq2seq model achieved ROUGE-1, ROUGE-2, and ROUGE-L scores of 16.51, 12.21, and 15.21 respectively; the Transformer model with multi-head attention improved these significantly, to 26.21, 35.41, and 20.58. This demonstrates that the Transformer-based model outperforms the Seq2seq GRU model in terms of ROUGE scores. The results of this study highlight the effectiveness of different deep learning models for abstractive text summarization, with the proposed Transformer model with multi-head attention proving the superior choice. However, there are still potential areas for improvement that could be explored in future work.
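The comparison in the abstract rests on ROUGE scores, which measure n-gram overlap between a generated summary and a reference. As a minimal illustration (a pure-Python sketch, not the exact scorer used in this report, which typically applies stemming and other preprocessing), ROUGE-N precision, recall, and F1 can be computed like this:

```python
from collections import Counter

def rouge_n(reference, candidate, n=1):
    """Compute ROUGE-N precision, recall, and F1 from n-gram overlap.

    reference: the human-written summary; candidate: the model output.
    """
    ref_tokens = reference.lower().split()
    cand_tokens = candidate.lower().split()
    # Count n-grams as multisets so repeated n-grams are handled correctly.
    ref_ngrams = Counter(tuple(ref_tokens[i:i + n])
                         for i in range(len(ref_tokens) - n + 1))
    cand_ngrams = Counter(tuple(cand_tokens[i:i + n])
                          for i in range(len(cand_tokens) - n + 1))
    if not ref_ngrams or not cand_ngrams:
        return 0.0, 0.0, 0.0
    # Multiset intersection: clipped overlap count.
    overlap = sum((ref_ngrams & cand_ngrams).values())
    precision = overlap / sum(cand_ngrams.values())
    recall = overlap / sum(ref_ngrams.values())
    f1 = (2 * precision * recall / (precision + recall)) if overlap else 0.0
    return precision, recall, f1

# Hypothetical example sentences, for illustration only.
ref = "the transformer model produced a fluent summary"
cand = "the transformer produced a summary"
p, r, f = rouge_n(ref, cand, n=1)
```

Here every candidate unigram also appears in the reference, so precision is 1.0, while recall is 5/7 because two reference words are missed; the reported ROUGE-1/2/L scores in the abstract are averages of such per-summary scores over the CNN/DailyMail test set.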
URI: http://10.1.7.192:80/jspui/handle/123456789/11862
Appears in Collections:Dissertation, CE

Files in This Item:
File: 21MCEC01.pdf
Description: 21MCEC01
Size: 494.4 kB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.