

Virtual oral in Affinity Workshop: Tiny Papers Showcase Day (a DEI initiative)

MetaXLR - Mixed Language Meta Representation Transformation for Low-resource Cross-lingual Learning based on Multi-Armed Bandit

Liat Bezalel · Eyal Orgad


Abstract:

Transfer learning for extremely low-resource languages is a challenging task, as there are no large-scale monolingual corpora for pre-training and no sufficient annotated data for fine-tuning. We follow the work of Xia et al. (2021), which uses meta learning to transfer from a single source language to an extremely low-resource one. We propose an enhanced approach that uses multiple source languages chosen in a data-driven manner. In addition, we introduce a sample selection strategy that decides which language to draw training examples from using a multi-armed bandit algorithm. With both of these improvements we achieve state-of-the-art results on the NER task for extremely low-resource languages while using the same amount of data, and the learned representations generalize better. Moreover, because the method can draw on multiple source languages, the framework can exploit much larger amounts of data, while still outperforming the earlier MetaXL method even when restricted to the same amount of data.
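
The abstract above describes selecting among candidate source languages with a multi-armed bandit during training. As a rough illustration only (the authors' implementation is not shown on this page), the sketch below uses an EXP3-style bandit in which each arm is a source language and the reward is the drop in loss on the target-language development set after training on a batch from that language. The class name, language codes, and reward definition are illustrative assumptions, not the paper's actual code.

    import math
    import random

    class Exp3LanguageSampler:
        """EXP3-style bandit: each arm is a candidate source language (hypothetical sketch)."""

        def __init__(self, languages, gamma=0.1):
            self.languages = list(languages)
            self.gamma = gamma                                  # exploration rate
            self.weights = {lang: 1.0 for lang in self.languages}

        def _probs(self):
            total = sum(self.weights.values())
            k = len(self.languages)
            return {lang: (1 - self.gamma) * (w / total) + self.gamma / k
                    for lang, w in self.weights.items()}

        def choose(self):
            probs = self._probs()
            langs = list(probs)
            lang = random.choices(langs, weights=[probs[l] for l in langs])[0]
            return lang, probs

        def update(self, lang, reward, probs):
            # Importance-weighted reward estimate keeps the update unbiased.
            estimated = reward / probs[lang]
            self.weights[lang] *= math.exp(self.gamma * estimated / len(self.languages))

    # Hypothetical training loop: reward is the improvement in target-language dev loss
    # after one training step on a batch from the chosen source language.
    sampler = Exp3LanguageSampler(["swa", "hau", "yor", "ibo"])   # placeholder language codes
    prev_dev_loss = 2.0                                           # stand-in for a real dev-set evaluation
    for step in range(1000):
        lang, probs = sampler.choose()
        # batch = sample_batch(lang); dev_loss = train_step_and_eval(batch)  # not shown here
        dev_loss = prev_dev_loss - random.uniform(-0.01, 0.02)    # stand-in for a measured loss
        reward = max(0.0, min(1.0, prev_dev_loss - dev_loss))     # clip reward to [0, 1] for EXP3
        sampler.update(lang, reward, probs)
        prev_dev_loss = dev_loss

In this sketch, languages whose batches consistently improve the target-language dev loss accumulate larger weights and are sampled more often, while the exploration term keeps every language visited occasionally; how the actual paper defines the reward and exploration schedule may differ.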
