- Title
Multilingual Transformers for Named Entity Recognition.
- Authors
VĪKSNA, Rinalds; SKADIŅA, Inguna
- Abstract
Different methods for automatic named entity recognition (NER) have been researched for many years. Today, the most common technique for training NER models is fine-tuning of large pre-trained language models. In this paper, we investigate the performance of various multilingual NER models in the state-of-the-art natural language processing framework Flair and compare them against the multilingual NER solution of the MAPA anonymization toolkit and the multilingual BERT model fine-tuned for NER. We demonstrate that in multilingual settings the best results can be achieved with a fine-tuned XLM-R model, while in the case of Latvian (monolingual settings), the more targeted LitLat BERT model leads to the best results.
- Subjects
LANGUAGE & languages
- Publication
Baltic Journal of Modern Computing, 2022, Vol 10, Issue 3, p457
- ISSN
2255-8942
- Publication type
Article
- DOI
10.22364/bjmc.2022.10.3.18