Two-Stage Text Summarization
We experiment with a two-stage summarization model on the CNN/DailyMail dataset that combines the ability to filter informative sentences (as in extractive summarization) with the ability to paraphrase (as in abstractive summarization). Our best model achieves a ROUGE-L F1 score of 39.82, outperforming the strong Lead-3 baseline and BERTSumEXT. Qualitative analysis indicates better readability and factual accuracy. Further, fine-tuning both stages with our oracle as the gold references shows the potential to outperform BART.
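The pipeline described above can be sketched as follows. This is an illustrative stand-in, not the repo's actual models: the extractive stage here uses a crude word-frequency heuristic in place of a learned sentence selector, and the abstractive stage is a placeholder where a seq2seq paraphraser (e.g. a BART-style model) would go. All function names are hypothetical.

```python
from collections import Counter
from typing import List


def extractive_stage(document: str, k: int = 3) -> List[str]:
    """Stage 1 (sketch): score sentences by word-frequency overlap with the
    document and keep the top-k, preserving original order. A real system
    would use a trained extractor instead of this heuristic."""
    sentences = [s.strip() for s in document.split(".") if s.strip()]
    freqs = Counter(w.lower() for s in sentences for w in s.split())
    ranked = sorted(
        range(len(sentences)),
        key=lambda i: sum(freqs[w.lower()] for w in sentences[i].split()),
        reverse=True,
    )
    keep = sorted(ranked[:k])  # restore document order
    return [sentences[i] for i in keep]


def abstractive_stage(sentences: List[str]) -> str:
    """Stage 2 (placeholder): a real system would paraphrase the extracted
    sentences with a seq2seq model; here we simply join them."""
    return ". ".join(sentences) + "."


def summarize(document: str, k: int = 3) -> str:
    """Chain the two stages: extract, then paraphrase."""
    return abstractive_stage(extractive_stage(document, k))
```

In the full system, the extractive stage filters the input down to its most informative sentences before the abstractive stage rewrites them, which keeps the generator's input short and focused.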
conda create -n text-sum python=3.8
conda activate text-sum
pip install -r src/requirements.txt