ICLR 2018


Workshop

PixelSNAIL: An Improved Autoregressive Generative Model

Xi Chen · Nikhil Mishra · Mostafa Rohaninejad · Pieter Abbeel

East Meeting Level 8 + 15 #13

Autoregressive generative models achieve the best results in density estimation tasks involving high-dimensional data, such as images or audio. They pose density estimation as a sequence modeling task, where a recurrent neural network (RNN) models the distribution over the next element conditioned on all previous elements. In this paradigm, the bottleneck is the extent to which the RNN can model long-range dependencies, and the most successful approaches rely on causal convolutions. Taking inspiration from recent work in meta-reinforcement learning, where dealing with long-range dependencies is also essential, we introduce a new generative model architecture that combines causal convolutions with self-attention. In this paper, we describe the resulting model and present state-of-the-art log-likelihood results on heavily benchmarked datasets: CIFAR-10 (2.85 bits per dim), $32 \times 32$ ImageNet (3.80 bits per dim) and $64 \times 64$ ImageNet (3.52 bits per dim). Our implementation is publicly available at \url{https://github.com/neocxi/pixelsnail-public}.
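To make the combination of causal convolutions and self-attention concrete, below is a minimal, hypothetical PyTorch sketch of a residual block that pairs a left-padded causal convolution (local receptive field) with a single-head, causally masked self-attention layer (unbounded receptive field). It operates on 1-D sequences for clarity and is only an illustration of the general idea; the class names (`CausalConv1d`, `MaskedSelfAttention`, `ConvAttentionBlock`) and all hyperparameters are invented here and do not correspond to the authors' actual PixelSNAIL implementation, which uses 2-D gated residual blocks over images.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalConv1d(nn.Module):
    """1-D convolution that only sees the current and past positions."""
    def __init__(self, channels, kernel_size=2, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation  # left-pad so no future leakage
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):                # x: (batch, channels, length)
        x = F.pad(x, (self.pad, 0))      # pad only on the left (the past)
        return self.conv(x)


class MaskedSelfAttention(nn.Module):
    """Single-head self-attention with a causal (upper-triangular) mask."""
    def __init__(self, channels):
        super().__init__()
        self.qkv = nn.Linear(channels, 3 * channels)
        self.scale = channels ** -0.5

    def forward(self, x):                # x: (batch, length, channels)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale
        length = x.size(1)
        future = torch.triu(torch.ones(length, length, dtype=torch.bool,
                                       device=x.device), diagonal=1)
        scores = scores.masked_fill(future, float("-inf"))
        return torch.matmul(F.softmax(scores, dim=-1), v)


class ConvAttentionBlock(nn.Module):
    """Residual block: causal convolution followed by masked self-attention."""
    def __init__(self, channels):
        super().__init__()
        self.conv = CausalConv1d(channels)
        self.attn = MaskedSelfAttention(channels)

    def forward(self, x):                # x: (batch, length, channels)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)
        x = x + F.gelu(h)                # local context from the convolution
        x = x + self.attn(x)             # long-range context from attention
        return x


if __name__ == "__main__":
    block = ConvAttentionBlock(channels=64)
    x = torch.randn(8, 128, 64)          # batch of 8 sequences, length 128
    print(block(x).shape)                # torch.Size([8, 128, 64])
```

Because both sub-layers are causal, position $t$ of the output depends only on inputs at positions $\le t$, so stacking such blocks preserves the autoregressive factorization needed for density estimation.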
