

Virtual presentation / top 25% paper

Self-Guided Noise-Free Data Generation for Efficient Zero-Shot Learning

Jiahui Gao · Renjie Pi · Yong Lin · Hang Xu · Jiacheng Ye · Zhiyong Wu · Weizhong Zhang · Xiaodan Liang · Zhenguo Li · Lingpeng Kong

Keywords: [ Prompt-Based Learning ] [ Efficient Zero-Shot Learning ] [ Pre-Trained Language Model ] [ Applications ]


Abstract:

There is rising interest in further exploring the zero-shot learning potential of large pre-trained language models (PLMs). A new paradigm called data-generation-based zero-shot learning has achieved impressive success. In this paradigm, data synthesized by the PLM serves as the carrier of its knowledge and is used to train a task-specific model with orders of magnitude fewer parameters than the PLM, achieving both higher performance and greater efficiency than prompt-based zero-shot learning methods applied directly to PLMs. The main hurdle of this approach is that the data synthesized by the PLM usually contains a significant portion of low-quality samples. Fitting the task-specific model to such data greatly hampers its performance and makes it unreliable for deployment. Previous methods remedy this issue mainly by filtering the synthetic data with heuristic metrics (e.g., output confidence) or by refining the data with the help of human experts, which entails excessive manual tuning or expensive labeling costs. In this paper, we propose SunGen, a novel noise-robust re-weighting framework that automatically constructs high-quality data for zero-shot classification problems. Our framework learns sample weights indicating data quality without requiring any human annotation. We theoretically and empirically verify our method's ability to construct good-quality synthetic datasets. Notably, SunGen-LSTM yields a 9.8% relative improvement over the baseline in average accuracy across eight established text classification tasks.
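To make the re-weighting idea concrete, below is a minimal, self-contained PyTorch sketch of learning per-sample weights over noisy synthetic data without human annotation. It is an illustration of the general technique only, not the authors' exact SunGen procedure: the toy data, the MAE-style noise-robust loss, and the simple alternating update are all illustrative assumptions.

```python
# Hypothetical sketch: learn per-sample weights that down-weight noisy
# synthetic samples. NOT the exact SunGen algorithm from the paper.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Toy "synthetic" classification data; some labels are deliberately
# corrupted, mimicking low-quality samples generated by a PLM.
n, d, k = 512, 32, 4
x = torch.randn(n, d)
y = (x @ torch.randn(d, k)).argmax(dim=1)
noisy = torch.rand(n) < 0.3                      # 30% corrupted labels
y[noisy] = torch.randint(0, k, (int(noisy.sum()),))

model = torch.nn.Linear(d, k)                    # small task-specific model
log_w = torch.zeros(n, requires_grad=True)       # learnable per-sample weights
opt_model = torch.optim.Adam(model.parameters(), lr=1e-2)
opt_w = torch.optim.Adam([log_w], lr=1e-1)

def robust_loss(logits, targets):
    # Mean absolute error as a noise-robust surrogate (an assumption; any
    # bounded loss that tolerates label noise could be substituted).
    p = F.softmax(logits, dim=1)
    onehot = F.one_hot(targets, k).float()
    return (p - onehot).abs().sum(dim=1)         # per-sample loss

for step in range(200):
    # Inner step: fit the model on weight-scaled cross-entropy.
    w = torch.softmax(log_w, dim=0) * n          # normalized sample weights
    per_sample = F.cross_entropy(model(x), y, reduction="none")
    (w.detach() * per_sample).mean().backward()
    opt_model.step(); opt_model.zero_grad()

    # Outer step: update the weights to lower the noise-robust loss,
    # reallocating mass from high-loss (likely noisy) to low-loss samples.
    w = torch.softmax(log_w, dim=0) * n
    (w * robust_loss(model(x), y).detach()).mean().backward()
    opt_w.step(); opt_w.zero_grad()

# Noisy samples typically end up with smaller learned weights on average.
w = torch.softmax(log_w, dim=0) * n
print("mean weight (clean):", w[~noisy].mean().item())
print("mean weight (noisy):", w[noisy].mean().item())
```

The softmax normalization keeps the total weight fixed, so lowering the weighted robust loss necessarily shifts weight away from samples the loss flags as noisy; a filtered or re-weighted dataset can then train the final task-specific model.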
