- Title
Hypert: hypernymy-aware BERT with Hearst pattern exploitation for hypernym discovery.
- Authors
Yun, Geonil; Lee, Yongjae; Moon, A-Seong; Lee, Jaesung
- Abstract
Hypernym discovery is challenging because it aims to find suitable instances for a given hyponym from a predefined hypernym vocabulary. Existing hypernym discovery methods use supervised learning with word2vec embeddings. However, word2vec embeddings suffer from poor quality for unseen or rare noun phrases because each entire noun phrase is embedded into a single vector. Recently, prompting methods have attempted to find hypernyms using pretrained language models with masked prompts. Although language models alleviate the problems of word2vec embeddings, general-purpose language models are ineffective at capturing hypernym relationships. Considering the hypernym relationship to be a linguistic domain, we introduce Hypert, which is further pretrained using masked language modeling on Hearst pattern sentences. To the best of our knowledge, this is the first such attempt in the hypernym discovery field. We also present a fine-tuning strategy for training Hypert with special input prompts for the hypernym discovery task. The proposed method outperformed the comparison methods and achieved statistically significant results on three subtasks of hypernym discovery. Additionally, we demonstrate the effectiveness of several proposed components through an in-depth analysis. The code is available at: https://github.com/Gun1Yun/Hypert.
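To make the two ingredients in the abstract concrete, here is a minimal sketch (not the authors' code; the pattern templates and prompt wording are assumptions) of how Hearst-pattern sentences for further pretraining and a masked prompt for hypernym prediction might be constructed from a (hyponym, hypernym) pair:

```python
# Illustrative sketch only: the abstract describes further pretraining on
# Hearst pattern sentences and fine-tuning with special masked prompts.
# These specific templates and the prompt format are assumptions, not
# taken from the paper.

HEARST_TEMPLATES = [
    "{hyper} such as {hypo}",
    "{hypo} and other {hyper}",
    "{hyper} including {hypo}",
]

def hearst_sentences(hypo: str, hyper: str) -> list:
    """Instantiate each Hearst pattern template with a (hyponym, hypernym) pair."""
    return [t.format(hypo=hypo, hyper=hyper) for t in HEARST_TEMPLATES]

def masked_prompt(hypo: str, mask_token: str = "[MASK]") -> str:
    """A masked prompt asking a BERT-style model to predict the hypernym."""
    return "{} is a kind of {}.".format(hypo, mask_token)

print(hearst_sentences("salmon", "fish"))
# e.g. pretraining sentences: "fish such as salmon", "salmon and other fish", ...
print(masked_prompt("salmon"))
# e.g. fine-tuning prompt: "salmon is a kind of [MASK]."
```

In practice such prompts would be fed to a masked language model (e.g. via a fill-mask head), and the pattern sentences would form the corpus for the continued masked-language-modeling stage.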
- Subjects
LANGUAGE models; NATURAL language processing; NOUN phrases (Grammar)
- Publication
Journal of Big Data, 2023, Vol 10, Issue 1, p1
- ISSN
2196-1115
- Publication type
Article
- DOI
10.1186/s40537-023-00818-0