

Paper Teaser in Workshop: Emergent Communication: New Frontiers

Emergent Communication Fine-tuning (EC-FT) for Pretrained Language Models

Shane Steinert-Threlkeld · Xuhui Zhou · Zeyu Liu · C. Downey

Keywords: [ machine translation ] [ emergent communication ] [ fine-tuning ]


Abstract:

It has recently been argued that the currently dominant paradigm in NLP of pretraining on text-only corpora will not yield robust natural language understanding systems. One strand of this argumentation highlights the need for grounded, goal-oriented, and interactive language learning. In this position paper, we articulate how Emergent Communication (EC) can be used in conjunction with large pretrained language models as a 'Fine-Tuning' (FT) step (hence, EC-FT) in order to provide them with supervision from such learning scenarios. We discuss methodological issues and difficulties with making this work, and then illustrate the overall idea with a case study in unsupervised machine translation, before concluding with a discussion of the relation to multimodal pretraining.
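
The abstract describes the EC-FT idea only at a high level. As a rough illustration of the kind of interactive supervision it refers to, the sketch below implements a Lewis-style referential game: a sender emits a discrete message about a target input, and a receiver must identify the target among distractors, with the sender trained by REINFORCE on communication success. This is a minimal, self-contained toy with tiny placeholder networks and made-up sizes (VOCAB, MSG_LEN, FEAT, HID are assumptions), not the authors' implementation; in EC-FT the sender and receiver would instead be initialized from a pretrained language model and fine-tuned with this game signal.

```python
# Toy sketch of an EC-FT-style training step (assumed setup, not the paper's code):
# a Lewis referential game where the sender describes a target feature vector with
# a discrete message and the receiver picks the target among candidates.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, MSG_LEN, FEAT, HID = 32, 5, 64, 128  # hypothetical sizes

class Sender(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(FEAT, HID)
        self.rnn = nn.GRUCell(HID, HID)
        self.embed = nn.Embedding(VOCAB, HID)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, target_feats):
        h = torch.tanh(self.proj(target_feats))
        tok = torch.zeros(target_feats.size(0), dtype=torch.long)
        msg, logps = [], []
        for _ in range(MSG_LEN):
            h = self.rnn(self.embed(tok), h)
            dist = torch.distributions.Categorical(logits=self.out(h))
            tok = dist.sample()              # discrete symbol (non-differentiable)
            msg.append(tok)
            logps.append(dist.log_prob(tok))
        return torch.stack(msg, 1), torch.stack(logps, 1).sum(1)

class Receiver(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, HID)
        self.rnn = nn.GRU(HID, HID, batch_first=True)
        self.proj = nn.Linear(FEAT, HID)

    def forward(self, msg, candidate_feats):
        _, h = self.rnn(self.embed(msg))                  # h: (1, B, HID)
        cands = self.proj(candidate_feats)                # (B, K, HID)
        return torch.bmm(cands, h[-1].unsqueeze(2)).squeeze(2)  # (B, K) scores

sender, receiver = Sender(), Receiver()
opt = torch.optim.Adam(list(sender.parameters()) + list(receiver.parameters()), lr=1e-3)

def ec_ft_step(candidate_feats, target_idx):
    """One game round: the reward is the receiver picking the right candidate."""
    B = candidate_feats.size(0)
    target = candidate_feats[torch.arange(B), target_idx]
    msg, logp = sender(target)
    scores = receiver(msg, candidate_feats)
    # Receiver trained with cross-entropy; sender with REINFORCE on accuracy.
    recv_loss = F.cross_entropy(scores, target_idx)
    reward = (scores.argmax(1) == target_idx).float()
    sender_loss = -((reward - reward.mean()) * logp).mean()
    opt.zero_grad()
    (recv_loss + sender_loss).backward()
    opt.step()
    return reward.mean().item()

# Example round with random stand-in "grounded" features: batch of 8, 4 candidates each.
feats = torch.randn(8, 4, FEAT)
labels = torch.randint(0, 4, (8,))
print("accuracy:", ec_ft_step(feats, labels))
```

In the paper's setting, the interesting design question is how to connect such a game to a pretrained LM (e.g., which parameters to update and how to map discrete game messages onto the model's vocabulary); the sketch above only shows the game-and-reward loop that would supply the fine-tuning signal.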
