- Title
Abstractive Sentence Compression with Event Attention.
- Authors
Choi, Su Jeong; Jung, Ian; Park, Seyoung; Park, Seong-Bae
- Abstract
Sentence compression aims at generating a shorter sentence from a long and complex source sentence while preserving the important content of the source sentence. Since it provides enhanced comprehensibility and readability to readers, sentence compression is required for summarizing news articles, in which event words play a key role in delivering the meaning of the source sentence. Therefore, this paper proposes an abstractive sentence compression model with event attention. When compressing a sentence from a news article, event words should be preserved as important information. To this end, event attention is proposed, which focuses on the event words of the source sentence when generating a compressed sentence. The global information of the source sentence is as significant as the event words, since it captures the information of the whole source sentence. Thus, the proposed model generates a compressed sentence by combining both attentions. According to the experimental results, the proposed model outperforms both the standard sequence-to-sequence model and the pointer-generator on three datasets, namely the MSR dataset, the Filippova dataset, and a Korean sentence compression dataset. In particular, it achieves a 122% higher BLEU score than the sequence-to-sequence model. Therefore, the proposed model is effective for sentence compression.
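The abstract describes combining an event-focused attention distribution with a global attention distribution over the source sentence. The paper's exact formulation is not given in this record, so the following is only an illustrative sketch under assumed choices: dot-product attention scores, a boolean mask marking event-word positions, and a fixed interpolation weight `lam` for mixing the two distributions (all hypothetical names).

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def combined_attention(enc_states, dec_state, event_mask, lam=0.5):
    """Sketch of mixing global attention with event attention.

    enc_states : (T, d) encoder hidden states for the source tokens
    dec_state  : (d,)   current decoder hidden state
    event_mask : (T,)   True at positions of event words
    lam        : weight on the global distribution (assumed fixed here;
                 a real model might learn it)
    """
    scores = enc_states @ dec_state              # dot-product scores, (T,)
    global_attn = softmax(scores)                # attends to the whole source
    masked = np.where(event_mask, scores, -1e9)  # suppress non-event tokens
    event_attn = softmax(masked)                 # attends only to event words
    attn = lam * global_attn + (1.0 - lam) * event_attn
    context = attn @ enc_states                  # weighted sum of states, (d,)
    return attn, context
```

Because the event distribution places all of its mass on event-word positions, the mixture necessarily shifts attention toward event words relative to global attention alone, while `lam` preserves coverage of the rest of the sentence.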
- Subjects
PLAYS on words; CRIMINAL sentencing; INFORMATION resources; ATTENTION; KEYWORDS
- Publication
Applied Sciences (2076-3417), 2019, Vol 9, Issue 19, p3949
- ISSN
2076-3417
- Publication type
Article
- DOI
10.3390/app9193949