Abstract
Reverse dictionary is the task of finding the proper target word given its description. In this paper, we incorporate BERT into this task. However, since BERT is based on byte-pair encoding (BPE) subwords, it is nontrivial to make BERT generate a word given a description. We propose a simple but effective method to make BERT generate the target word for this specific task. Besides, the cross-lingual reverse dictionary is the task of finding the proper target word when the description is given in another language. Previous models have to maintain two separate word embeddings and learn to align them. By using Multilingual BERT (mBERT), we can efficiently perform the cross-lingual reverse dictionary task with a single subword embedding, and no explicit alignment between languages is necessary. More importantly, mBERT achieves remarkable cross-lingual reverse dictionary performance even without a parallel corpus, which means it can conduct the cross-lingual reverse dictionary with only the corresponding monolingual data.

Anthology ID: 2020.findings-emnlp.388
Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
Month: November
Year: 2020
Address: Online
Venues: Findings | EMNLP
Publisher: Association for Computational Linguistics
Pages: 4329–4338
DOI: 10.18653/v1/2020.findings-emnlp.388
Bibkey: yan-etal-2020-bert
Cite (ACL): Hang Yan, Xiaonan Li, Xipeng Qiu, and Bocao Deng.
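For intuition, here is a minimal sketch of how a BERT masked language model could be queried for a reverse-dictionary lookup, using the HuggingFace `transformers` library. The prompt template, candidate list, and single-subword restriction are illustrative assumptions only; this is not the method proposed in the paper, which specifically addresses the harder case of generating multi-subword (BPE) targets.

```python
# Illustrative sketch only: rank candidate words for a description with a
# BERT masked LM. The prompt template and candidates are hypothetical,
# not the paper's proposed method.
import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

description = "a large gray animal with a long trunk"
# Ask BERT to fill the target-word slot that follows the description.
inputs = tokenizer(f"{description} : [MASK].", return_tensors="pt")
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0].item()

with torch.no_grad():
    logits = model(**inputs).logits[0, mask_pos]  # scores over the vocabulary

# Score only candidates that map to a single subword; multi-subword words
# are exactly the nontrivial BPE case the abstract points out.
candidates = ["elephant", "mouse", "bridge"]
scores = {}
for word in candidates:
    pieces = tokenizer.tokenize(word)
    if len(pieces) == 1:
        scores[word] = logits[tokenizer.convert_tokens_to_ids(pieces[0])].item()

print(max(scores, key=scores.get))  # ideally "elephant"
```

A ranking setup like this sidesteps generation entirely, which is why it breaks down for words that BPE splits into several subwords; handling that case is the core difficulty the paper's method targets.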