- Title
Abstractive Text Summarization Model with Coherence Reinforcement and No Ground Truth Dependency.
- Authors
CHEN Gongchi; RONG Huan; MA Tinghuai
- Abstract
Automatic text summarization aims to compress a given document into a short summary that efficiently reflects the main idea of the source. Abstractive summarization has become a research hotspot in the field of text summarization because it can paraphrase the source document with flexible and abundant vocabulary. However, existing abstractive summarization models reorganize original words and add new words when generating a summary, which can easily cause inconsistency and low readability. In addition, traditional supervised learning based on labeled data incurs a high cost to improve the coherence of summary sentences, which limits practical application. Therefore, this paper proposes an abstractive text summarization model with coherence reinforcement and no ground-truth dependency (ATS_CG). On the one hand, based on the embedding of the source document, the model generates extractive labels that describe the filtering of key information; the filtered sentence embeddings are then decoded by the decoder. On the other hand, based on the word probability distribution output by the decoder, two types of summary are generated according to "probability selection" and "Softmax-greedy selection". The model then computes the overall rewards of the two summaries in terms of coherence and content. Next, the model learns to filter key sentences and decode them through the self-critical policy gradient, so as to generate abstractive summaries with high coherence and quality. Experiments show that ATS_CG is superior to existing text summarization methods in overall evaluation scores, even without any ground truth. The summaries generated by ATS_CG are also better than those of existing methods in coherence, relevance, redundancy, novelty, and perplexity.
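The training signal described in the abstract follows the general self-critical sequence-training pattern: the reward of a sampled summary is baselined against the reward of the greedy ("Softmax-greedy") summary, and the difference scales the policy gradient. A minimal sketch of that objective, with hypothetical helper names (this is not the authors' code, and the reward terms stand in for their coherence-plus-content reward):

```python
def scst_loss(sampled_log_probs, sampled_reward, greedy_reward):
    """Self-critical policy-gradient loss for one generated summary.

    sampled_log_probs: per-token log-probabilities of the sampled summary
                       (from "probability selection")
    sampled_reward:    overall reward (coherence + content) of that summary
    greedy_reward:     reward of the greedy baseline summary
                       (from "Softmax-greedy selection")
    """
    # Baseline subtraction: only summaries that beat the greedy baseline
    # receive a positive advantage, so no ground-truth summary is needed.
    advantage = sampled_reward - greedy_reward
    # Minimizing this loss increases the probability of sampled summaries
    # whose reward exceeds the greedy baseline, and decreases it otherwise.
    return -advantage * sum(sampled_log_probs)
```

With `sampled_reward > greedy_reward` the loss rewards the sampled sequence's log-probabilities; with the inequality reversed it penalizes them, which is what lets the model improve coherence without labeled data.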
- Subjects
DISTRIBUTION (Probability theory); SUPERVISED learning; REWARD (Psychology); NATURAL language processing; NEW words
- Publication
Journal of Frontiers of Computer Science & Technology, 2022, Vol 16, Issue 3, p621
- ISSN
1673-9418
- Publication type
Article
- DOI
10.3778/j.issn.1673-9418.2109014