the-sergiu/Abstractive-Summarization-RoBERTa2RoBERTa

Leveraging RoBERTa for Abstractive Summarization using Pre-trained Encoders


Abstractive Summarization using a Pre-trained RoBERTa2RoBERTa Encoder-Decoder Architecture

This paper presents a comprehensive exploration of fine-tuning a RoBERTa encoder-decoder model for abstractive text summarization. The approach takes a RoBERTa model pre-trained on a large text corpus and fine-tunes the resulting encoder-decoder architecture on a summarization dataset. Extensive experiments on benchmark datasets demonstrate that the fine-tuned RoBERTa encoder-decoder achieves summarization quality comparable to existing methods. The study also examines the impact of data size, domain-specific fine-tuning, and transfer learning, highlighting the adaptability of RoBERTa-based models for generating coherent and informative summaries across diverse domains and contributing to research on abstractive summarization.
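The sketch below illustrates the general RoBERTa2RoBERTa setup described above, using the Hugging Face Transformers `EncoderDecoderModel` API to tie two pre-trained RoBERTa checkpoints into a seq2seq model and run one fine-tuning step. It is a minimal illustration, not the authors' exact training code: the `roberta-base` checkpoint, sequence lengths, and example texts are assumptions for demonstration.

```python
# Minimal sketch of a RoBERTa2RoBERTa encoder-decoder for abstractive
# summarization (assumed setup, not necessarily the paper's exact config).
from transformers import EncoderDecoderModel, RobertaTokenizerFast

tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")

# Tie a pre-trained RoBERTa encoder to a RoBERTa decoder; the decoder's
# cross-attention layers are randomly initialized and learned during fine-tuning.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "roberta-base", "roberta-base"
)

# Special tokens the seq2seq wrapper needs for training and generation.
model.config.decoder_start_token_id = tokenizer.bos_token_id
model.config.eos_token_id = tokenizer.eos_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# Hypothetical document/summary pair standing in for a real dataset example.
article = "Long input document to be summarized ..."
summary = "Reference summary of the document ..."

inputs = tokenizer(article, max_length=512, truncation=True, return_tensors="pt")
labels = tokenizer(summary, max_length=128, truncation=True,
                   return_tensors="pt").input_ids
# In real training, pad token ids in labels are usually replaced with -100
# so they are ignored by the cross-entropy loss.

# One fine-tuning step: the model returns the loss on the target summary.
loss = model(**inputs, labels=labels).loss
loss.backward()
```

At inference time, `model.generate(**inputs, max_length=128, num_beams=4)` would decode a summary with beam search; the beam size here is likewise an illustrative choice.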

The paper is included in this repository as paper_abstractive_summarization.pdf.
