

Poster

Understanding Composition of Word Embeddings via Tensor Decomposition

Abraham Frandsen · Rong Ge

Great Hall BC #30

Keywords: [ word embeddings ] [ semantic composition ] [ tensor decomposition ]


Abstract:

Word embeddings are a powerful tool in natural language processing. In this paper we consider the problem of word embedding composition: given vector representations of two words, compute a vector for the entire phrase. We give a generative model that can capture specific syntactic relations between words. Under our model, we prove that the correlations between three words, measured by their pointwise mutual information (PMI), form a tensor that has an approximate low-rank Tucker decomposition. The result of the Tucker decomposition gives the word embeddings as well as a core tensor, which can be used to produce better compositions of the word embeddings. We complement our theoretical results with experiments that verify our assumptions and demonstrate the effectiveness of the new composition method.
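To make the pipeline in the abstract concrete, here is a minimal NumPy sketch. It is not the authors' implementation: it assumes the PMI statistics are given as a symmetric 3-way tensor, uses plain higher-order SVD (HOSVD) as the Tucker decomposition rather than whatever fitting procedure the paper uses, and assumes a composition of the form u + v + T(u, v, ·), with T the core tensor, purely for illustration.

```python
# Sketch: Tucker-decompose a (toy) PMI tensor, then compose two word
# embeddings using the recovered core tensor. All specifics here
# (HOSVD, the u + v + T(u, v, .) composition) are illustrative
# assumptions, not taken from the paper page.
from itertools import permutations

import numpy as np


def tucker_hosvd(X, rank):
    """Tucker decomposition of a 3-way tensor via higher-order SVD."""
    factors = []
    for mode in range(3):
        # Mode-n unfolding: keep the top-`rank` left singular vectors.
        unfolded = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolded, full_matrices=False)
        factors.append(U[:, :rank])
    # Core tensor: contract X with each factor along its mode.
    core = np.einsum('ijk,ia,jb,kc->abc', X, *factors)
    return core, factors


def compose(core, u, v):
    """Phrase vector: additive baseline plus a core-tensor correction."""
    return u + v + np.einsum('abc,a,b->c', core, u, v)


# Toy usage on a random symmetric stand-in for a PMI tensor over a
# 50-word vocabulary.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 50, 50))
X = sum(X.transpose(p) for p in permutations(range(3))) / 6  # symmetrize
core, factors = tucker_hosvd(X, rank=8)
embeddings = factors[0]               # rows are word embeddings
phrase = compose(core, embeddings[3], embeddings[17])
print(phrase.shape)                   # (8,)
```

The point of the sketch is the division of labor the abstract describes: the factor matrices of the Tucker decomposition play the role of word embeddings, while the core tensor supplies the syntax-aware correction on top of simple vector addition.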
