Poster

Breaking the Softmax Bottleneck: A High-Rank RNN Language Model

Zhilin Yang · Zihang Dai · Ruslan Salakhutdinov · William W Cohen

East Meeting Level 1,2,3 #36

Abstract:

We formulate language modeling as a matrix factorization problem, and show that the expressiveness of Softmax-based models (including the majority of neural language models) is limited by a Softmax bottleneck. Given that natural language is highly context-dependent, this further implies that in practice Softmax with distributed word embeddings does not have enough capacity to model natural language. We propose a simple and effective method to address this issue, and improve the state-of-the-art perplexities on Penn Treebank and WikiText-2 to 47.69 and 40.68, respectively. The proposed method also excels on the large-scale 1B Word dataset, outperforming the baselines by over 5.6 points in perplexity.
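The method behind these results is the Mixture of Softmaxes (MoS). In the matrix factorization view, the matrix of context-word log-probabilities factors as A = HW^T, where the rows of H are d-dimensional context vectors and W is the word embedding matrix, so the rank of A is at most d; a single Softmax over distributed embeddings therefore cannot express a genuinely high-rank conditional distribution. MoS instead computes K component Softmaxes and mixes them in probability space, which lifts this rank limit. The following is a minimal PyTorch sketch of such an output layer; the class name, argument names, and component count are illustrative, not the authors' reference implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixtureOfSoftmaxes(nn.Module):
    """Sketch of a Mixture-of-Softmaxes (MoS) output layer.

    A single Softmax head yields a log-probability matrix of rank
    at most d (the hidden size); mixing K Softmaxes in probability
    space breaks this low-rank bottleneck.
    """

    def __init__(self, d_hidden, vocab_size, n_components=5):
        super().__init__()
        self.n_components = n_components
        # Logits for the mixture weights, one per component.
        self.prior = nn.Linear(d_hidden, n_components)
        # Projection producing K component context vectors.
        self.latent = nn.Linear(d_hidden, n_components * d_hidden)
        # Shared output (word embedding) matrix.
        self.decoder = nn.Linear(d_hidden, vocab_size)

    def forward(self, h):
        # h: (batch, d_hidden) context vector from an RNN.
        batch = h.size(0)
        # Mixture weights pi_k(h): (batch, K).
        pi = F.softmax(self.prior(h), dim=-1)
        # Component contexts h_k: (batch, K, d_hidden).
        latent = torch.tanh(self.latent(h)).view(batch, self.n_components, -1)
        # Per-component Softmax distributions: (batch, K, vocab).
        probs = F.softmax(self.decoder(latent), dim=-1)
        # Weighted mixture over components: (batch, vocab).
        mixed = torch.einsum('bk,bkv->bv', pi, probs)
        return torch.log(mixed + 1e-8)  # log-probabilities
```

In use, this layer simply replaces a standard linear-plus-log-Softmax head: the RNN's final hidden state goes in, and log-probabilities over the vocabulary come out, at the cost of K Softmax evaluations per step.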
