- Title
Narrowing the language gap: domain adaptation guided cross-lingual passage re-ranking.
- Authors
Chen, Dongmei; Zhang, Xin; Zhang, Sheng
- Abstract
For a given query, Cross-lingual Passage Re-ranking (XPR) aims to rank a list of candidate passages in multiple languages, where only a portion of the passages are in the query's language. Multilingual BERT (mBERT) is often used for the XPR task and achieves impressive performance. Nevertheless, two essential issues remain in mBERT: the performance gap between high- and low-resource languages, and the lack of explicit embedding distribution alignment. Regarding each language as a separate domain, we theoretically explore how these problems lead to errors in XPR under the guidance of domain adaptation. Based on this theoretical analysis, we propose a novel framework comprising two modules: knowledge distillation and adversarial learning. The former transfers knowledge from high-resource languages to low-resource ones, narrowing their performance gap. The latter encourages mBERT to align the embedding distributions across languages through a novel language-distinguishing task and adversarial training. Extensive experiments on in-domain and out-of-domain datasets confirm the effectiveness and robustness of the proposed framework and show that it outperforms state-of-the-art methods.
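The abstract does not give the distillation objective, but the knowledge-distillation idea it describes can be sketched minimally: a teacher re-ranker's scores over the candidate passage list are softened with a temperature and the student is penalized by the KL divergence between the two score distributions (standard Hinton-style distillation; the paper's exact loss, temperature, and teacher/student setup are assumptions here, not taken from the abstract).

```python
import numpy as np

def softmax(scores, temperature=1.0):
    """Turn raw re-ranking scores into a probability distribution
    over the candidate passages (temperature softens the peaks)."""
    z = np.asarray(scores, dtype=float) / temperature
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(teacher_scores, student_scores, temperature=2.0):
    """KL(teacher || student) over softened score distributions.
    Illustrative only -- the paper's actual objective may differ."""
    p = softmax(teacher_scores, temperature)
    q = softmax(student_scores, temperature)
    return float(np.sum(p * (np.log(p) - np.log(q))))

# Hypothetical scores over three candidate passages: a student whose
# ranking agrees with the teacher incurs a lower distillation loss
# than one whose ranking is reversed.
teacher = [3.0, 1.0, 0.5]
agreeing = [2.8, 1.1, 0.4]
reversed_ = [0.5, 1.0, 3.0]
assert kd_loss(teacher, agreeing) < kd_loss(teacher, reversed_)
```

In the framework described above, the teacher would score passages in a high-resource language and the student would be trained to match those distributions on low-resource languages; the adversarial module (not sketched) would additionally train mBERT so a language discriminator cannot tell the embeddings apart.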
- Subjects
KNOWLEDGE transfer; LANGUAGE & languages; PHYSIOLOGICAL adaptation
- Publication
Neural Computing & Applications, 2023, Vol. 35, Issue 28, p. 20735
- ISSN
0941-0643
- Publication type
Article
- DOI
10.1007/s00521-023-08803-7