ICLR 2018


Workshop

Learning Efficient Tensor Representations with Ring Structure Networks

Qibin Zhao · Masashi Sugiyama · Longhao Yuan · Andrzej Cichocki

East Meeting Level 8 + 15 #13

Tensor train (TT) decomposition is a powerful representation for high-order tensors and has been successfully applied to various machine learning tasks in recent years. In this paper, we propose a more general tensor decomposition with a ring-structure network, obtained by applying circular multilinear products over a sequence of lower-order core tensors, which we term the tensor ring (TR) representation. Several learning algorithms are presented, including blockwise ALS with adaptive tensor ranks and a highly scalable SGD. Furthermore, the mathematical properties of the model are investigated, which enables us to perform basic algebraic operations in a computationally efficient way using TR representations. Experimental results on synthetic signals and real-world datasets demonstrate the effectiveness of the TR model and the learning algorithms. In particular, we show that the structural information and high-order correlations within a 2D image can be captured efficiently by combining tensorization with the TR representation.
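To make the circular multilinear product concrete, the following is a minimal NumPy sketch (not the authors' code) of reconstructing a full tensor from TR cores: each core has shape (r_k, n_k, r_{k+1}), the last rank wraps around to the first (the ring closure), and entry T[i1, ..., id] is the trace of the product of the corresponding core slices. The function name `tr_to_full` and the example ranks are illustrative assumptions.

```python
import numpy as np

def tr_to_full(cores):
    """Reconstruct a full tensor from its tensor ring (TR) cores.

    Each core cores[k] has shape (r_k, n_k, r_{k+1}), with the last
    rank equal to the first (ring closure). Entry T[i1, ..., id] is
    the trace of the product of the corresponding core slices.
    """
    shape = tuple(c.shape[1] for c in cores)
    full = np.empty(shape)
    for idx in np.ndindex(*shape):
        # Multiply the slice G_k[:, i_k, :] of every core in order,
        # then close the ring by taking the trace.
        mat = np.eye(cores[0].shape[0])
        for k, i in enumerate(idx):
            mat = mat @ cores[k][:, i, :]
        full[idx] = np.trace(mat)
    return full

# Example: random TR cores for a 3x4x5 tensor with ranks (2, 3, 2)
rng = np.random.default_rng(0)
ranks = [2, 3, 2, 2]  # last rank equals first: ring closure
dims = [3, 4, 5]
cores = [rng.standard_normal((ranks[k], dims[k], ranks[k + 1]))
         for k in range(3)]
T = tr_to_full(cores)
print(T.shape)  # (3, 4, 5)
```

Note that setting r_1 = 1 collapses the trace to a scalar product and recovers the TT format as a special case, which is why TR is described as a generalization of TT.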
