

Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections

Csaba Toth · Patric Bonnier · Harald Oberhauser

Keywords: [ generative modelling ] [ low-rank tensors ] [ sequential data ] [ time series ] [ classification ] [ representation learning ]


Sequential data such as time series, video, or text can be challenging to analyse as the ordered structure gives rise to complex dependencies. At the heart of this is non-commutativity, in the sense that reordering the elements of a sequence can completely change its meaning. We use a classical mathematical object -- the free algebra -- to capture this non-commutativity. To address the innate computational complexity of this algebra, we use compositions of low-rank tensor projections. This yields modular and scalable building blocks that give state-of-the-art performance on standard benchmark tasks such as multivariate time series classification, mortality prediction, and generative modelling of video.
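To make the core idea concrete, here is a minimal numpy sketch (not the authors' code; the function name, shapes, and rank-1 parametrisation are illustrative). A degree-m low-rank functional of a length-T sequence sums, over all strictly increasing index tuples i1 < ... < im, the product of inner products <u_k, x_{i_k}>. Computed naively this enumerates O(T^m) tuples, but a cumulative-sum recursion brings it down to O(T * m * d):

```python
import numpy as np

def lowrank_feature(x, us):
    """Degree-m low-rank feature of a sequence x of shape (T, d):
    the sum over strictly increasing index tuples i1 < ... < im of
    prod_k <u_k, x_{i_k}>, where us = [u_1, ..., u_m] are the rank-1
    projection vectors. Computed by a prefix-sum recursion in
    O(T * m * d) rather than enumerating all O(T^m) tuples."""
    s = None
    for u in us:
        proj = x @ u  # <u, x_t> for every time step t
        if s is None:
            s = proj  # degree 1: single-index sums end here
        else:
            # prefix[t] = sum of degree-(k-1) terms ending strictly before t
            prefix = np.concatenate(([0.0], np.cumsum(s)[:-1]))
            s = proj * prefix
    return float(s.sum())

# Order sensitivity: with u_1 = e1, u_2 = e2, the feature counts
# "coordinate 1 fires before coordinate 2", so reversing the sequence
# changes the value -- the non-commutativity the abstract refers to.
x = np.array([[1.0, 0.0], [0.0, 1.0]])
us = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
print(lowrank_feature(x, us))        # forward order
print(lowrank_feature(x[::-1], us))  # reversed order gives a different value
```

In the paper's framing, stacks of such projections (over several degrees and many weight vectors) form the trainable building blocks; this sketch shows only a single degree-m functional.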
