

Contributed Talk in Workshop: Neural Compression: From Information Theory to Applications

Spotlight 10: Leonhard Helminger et al., Lossy Image Compression with Normalizing Flows


Abstract:

Deep-learning-based image compression has recently seen exciting progress and has, in some cases, even surpassed transform-coding-based approaches. However, state-of-the-art solutions for deep image compression typically employ autoencoders that map the input to a lower-dimensional latent space and thus irreversibly discard information even before quantization. In contrast, traditional approaches to image compression apply an invertible transformation before the quantization step. Inspired by this, we propose a deep image compression method that can go from low bit-rates to near-lossless quality by leveraging normalizing flows to learn a bijective mapping from the image space to a latent representation. We further demonstrate advantages unique to our solution, such as the ability to maintain constant-quality results through re-encoding, even when it is performed multiple times. To the best of our knowledge, this is the first work to leverage normalizing flows for lossy image compression.
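
The property the abstract relies on is that a normalizing flow is bijective, so the only information loss comes from quantizing the latent; decoding and then re-encoding therefore reproduces the same quantized latent. The sketch below illustrates this with a toy additive coupling layer in NumPy. The layer, its parameters, and the rounding quantizer are illustrative assumptions for this sketch, not the authors' actual architecture or entropy model.

```python
import numpy as np

# Toy additive coupling layer (illustrative, not the paper's model):
# split the signal into two halves and shift one half by a nonlinear
# function of the other. The map is exactly invertible (bijective).

def coupling_forward(x, w):
    a, b = np.split(x, 2)
    b = b + np.tanh(w @ a)          # additive coupling step
    return np.concatenate([a, b])

def coupling_inverse(z, w):
    a, b = np.split(z, 2)
    b = b - np.tanh(w @ a)          # exact inverse of the coupling step
    return np.concatenate([a, b])

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))         # stand-in for learned parameters
x = rng.normal(size=8)              # stand-in for an image

# Lossy compression: map to the latent space, quantize, map back.
z_hat = np.round(coupling_forward(x, w))    # quantized latent (what would be coded)
x_hat = coupling_inverse(z_hat, w)          # lossy reconstruction

# Re-encoding the reconstruction recovers the same quantized latent,
# so repeated encode/decode cycles do not degrade quality further.
z_hat2 = np.round(coupling_forward(x_hat, w))
assert np.array_equal(z_hat, z_hat2)
```

Because the transform is invertible, re-encoding the reconstruction lands back on the same quantized latent (up to floating-point error far below the quantization step), which is why quality stays constant under repeated re-encoding; an autoencoder, by contrast, discards information on every pass.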