Title

CGM: Copy Mechanism GPT with Mask for Ellipsis and Anaphora Resolution in Dialogue.

Authors

Cho, Ji-Won; Oh, Jinyoung; Cha, Jeong-Won

Abstract

GPT (Generative Pre-trained Transformer) is a generative language model that demonstrates outstanding performance in text generation. In general, the attention mechanism of the Transformer behaves similarly to a copy distribution. However, because GPT has no dedicated encoder, it is difficult to ensure that the input is retained during generation. We propose a model that strengthens the copy mechanism in GPT: we generate masks for the input words to initialize the copy distribution and explicitly encourage copying during training. To demonstrate the effectiveness of our approach, we conducted experiments on restoring ellipsis and anaphora in dialogue. In a single domain we achieved 0.4319 (BLEU), 0.6408 (ROUGE-L), 0.9040 (SimCSE), and 0.9070 (BERTScore); in a multi-domain setting we obtained 0.4611 (BLEU), 0.6379 (ROUGE-L), 0.8902 (SimCSE), and 0.8999 (BERTScore). We also evaluated the copy mechanism on out-of-domain data, with excellent results. We anticipate that applying the copy mechanism to GPT will be useful for utilizing language models in constrained situations.
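
The record gives no implementation details, so the following is only a minimal, hypothetical sketch of the general idea the abstract describes: a pointer-generator-style copy distribution over the input tokens, with a mask deciding which positions may be copied. All function and tensor names, shapes, and the mixing weight p_copy are assumptions for illustration, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def copy_augmented_probs(gen_logits, attn_weights, input_ids, input_mask, p_copy):
    """Blend a generation distribution with a copy distribution over the
    input tokens, zeroing masked positions (pointer-generator style sketch).

    gen_logits:   (batch, vocab)   decoder logits at the current step
    attn_weights: (batch, src_len) attention over the input tokens
    input_ids:    (batch, src_len) vocabulary ids of the input tokens
    input_mask:   (batch, src_len) 1.0 for copyable tokens, 0.0 otherwise
    p_copy:       scalar or (batch, 1) mixing weight for copying (assumed)
    """
    gen_probs = F.softmax(gen_logits, dim=-1)
    # Suppress attention on masked (non-copyable) positions, then renormalize.
    masked_attn = attn_weights * input_mask
    masked_attn = masked_attn / masked_attn.sum(dim=-1, keepdim=True).clamp_min(1e-9)
    # Scatter the remaining attention mass onto the vocabulary ids of the input tokens.
    copy_probs = torch.zeros_like(gen_probs)
    copy_probs.scatter_add_(1, input_ids, masked_attn)
    # Mixture: generate from the vocabulary or copy from the input.
    return (1.0 - p_copy) * gen_probs + p_copy * copy_probs
```

How the mask interacts with training (e.g., how it initializes the distribution and how copying is encouraged by the loss) is specific to the paper and is not reproduced here.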

Subjects

GENERATIVE pre-trained transformers; LANGUAGE models; ANAPHORA (Linguistics)

Publication

Applied Sciences (2076-3417), 2025, Vol. 15, Issue 1, p. 5

ISSN

2076-3417

Publication type

Academic Journal

DOI

10.3390/app15010005
