- Title
Improving semantic coverage of data-to-text generation model using dynamic memory networks.
- Authors
Seifossadat, Elham; Sameti, Hossein
- Abstract
This paper proposes a sequence-to-sequence model for data-to-text generation, called DM-NLG, to generate natural language text from structured nonlinguistic input. Specifically, by adding a dynamic memory module to the attention-based sequence-to-sequence model, it can store the information that led to generating previous output words and use it to generate the next word. In this way, the decoder part of the model is aware of all previous decisions, and as a result, the generation of duplicate words or incomplete semantic concepts is prevented. To improve the quality of the sentences generated by the DM-NLG decoder, a postprocessing step is performed using pretrained language models. To prove the effectiveness of the DM-NLG model, we performed experiments on five different datasets and observed that our proposed model is able to reduce the slot error rate by 50% and improve the BLEU score by 10%, compared to the state-of-the-art models.
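The abstract's core idea — a decoder that, at each step, also attends over the states that produced its previous outputs — can be illustrated with a minimal NumPy sketch. This is not the paper's DM-NLG implementation; the weight matrices (`W_att`, `W_mem`, `W_out`), the additive way the memory context is concatenated, and all dimensions are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def decoder_step(dec_state, enc_states, memory, W_att, W_mem, W_out):
    """One decoding step with a dynamic memory over past decoder states.

    dec_state:  (d,)   current decoder hidden state
    enc_states: (T, d) encoder hidden states
    memory:     list of past decoder states (the "dynamic memory")
    W_att, W_mem: (d, d) attention projections (illustrative)
    W_out:      (V, 3d) output projection (illustrative)
    """
    # Standard attention over the encoder states.
    att = softmax(enc_states @ W_att @ dec_state)
    ctx = att @ enc_states

    # Dynamic memory: attend over the states behind previous output words,
    # so the decoder "sees" all of its earlier decisions.
    if memory:
        mem = np.stack(memory)
        m_att = softmax(mem @ W_mem @ dec_state)
        mem_ctx = m_att @ mem
    else:
        mem_ctx = np.zeros_like(dec_state)

    # Combine decoder state, encoder context, and memory context.
    logits = W_out @ np.concatenate([dec_state, ctx, mem_ctx])

    # Store the current state so future steps can attend to it.
    memory.append(dec_state)
    return logits
```

Conditioning the next-word distribution on this memory context is what, per the abstract, discourages the decoder from repeating words or dropping semantic slots it has already covered.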
- Subjects
LANGUAGE models; NATURAL languages; DYNAMIC models; RECURRENT neural networks; ERROR rates
- Publication
Natural Language Engineering, 2024, Vol 30, Issue 3, p454
- ISSN
1351-3249
- Publication type
Article
- DOI
10.1017/S1351324923000207