

Contributed Talk in Workshop: Neural Compression: From Information Theory to Applications

Spotlight 9: George Zhang et al., Universal Rate-Distortion-Perception Representations for Lossy Compression


Abstract:

In the context of lossy compression, Blau and Michaeli (2019) adopt a mathematical notion of perceptual quality defined in terms of a distributional constraint and characterize the three-way tradeoff between rate, distortion, and perception, generalizing the classical rate-distortion tradeoff. Within this rate-distortion-perception framework, we consider the notion of (approximately) universal representations, in which one may fix an encoder and vary the decoder to (approximately) achieve any point along the perception-distortion tradeoff. We show that the penalty for fixing the encoder is zero in the Gaussian case, and we give bounds for the case of arbitrary distributions. In principle, a small penalty obviates the need to design an end-to-end system for each particular objective. We provide experimental results on MNIST and SVHN showing that practical constructions exist that suffer only a small penalty, i.e., machine learning models learn representation maps that are approximately universal within their operational capacities.
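
For reference, the rate-distortion-perception function of Blau and Michaeli (2019) that this abstract builds on can be written as follows, where Δ is a distortion measure and d is a divergence between the source and reconstruction distributions (the notation here is a standard restatement, not quoted from this talk):

```latex
R(D, P) \;=\; \min_{p_{\hat{X}\mid X}} \; I(X; \hat{X})
\quad \text{subject to} \quad
\mathbb{E}\!\left[\Delta(X, \hat{X})\right] \le D,
\qquad
d\!\left(p_X,\, p_{\hat{X}}\right) \le P .
```

The universal-representation question asks how much larger the achievable (D, P) region becomes when the encoding distribution is fixed once and only the decoder is allowed to vary.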
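The following is a minimal PyTorch sketch, not the authors' code, of the fixed-encoder / varied-decoder setup the abstract describes: a single frozen encoder is shared, and separate decoders are trained for different distortion-perception trade-off weights. The network sizes, the use of MSE as the distortion, the moment-matching proxy for the perceptual (distributional) term, and the helper names (`make_decoder`, `train_decoder`, `perception_proxy`) are illustrative assumptions.

```python
import torch
import torch.nn as nn

latent_dim, data_dim = 8, 784  # hypothetical sizes (e.g., flattened MNIST)

# Shared encoder: trained once (elsewhere), then held fixed for all decoders.
encoder = nn.Sequential(nn.Linear(data_dim, 256), nn.ReLU(), nn.Linear(256, latent_dim))


def make_decoder():
    return nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, data_dim))


def perception_proxy(x, x_hat):
    # Crude stand-in for a divergence between p_X and p_{X_hat}
    # (the perceptual constraint); matches batch means and variances only.
    return ((x.mean(0) - x_hat.mean(0)) ** 2).sum() + ((x.var(0) - x_hat.var(0)) ** 2).sum()


def train_decoder(lmbda, data_loader, steps=1000):
    """Train one decoder against the frozen encoder for trade-off weight `lmbda`."""
    decoder = make_decoder()
    opt = torch.optim.Adam(decoder.parameters(), lr=1e-3)
    encoder.requires_grad_(False)  # the representation (and hence the rate) stays fixed
    for _, (x, _) in zip(range(steps), data_loader):
        x = x.view(x.size(0), -1)
        x_hat = decoder(encoder(x))
        distortion = ((x - x_hat) ** 2).mean()
        loss = distortion + lmbda * perception_proxy(x, x_hat)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return decoder


# Sweeping lmbda traces out an approximate distortion-perception tradeoff
# from one shared representation, e.g.:
# decoders = {lam: train_decoder(lam, loader) for lam in (0.0, 0.1, 1.0, 10.0)}
```

A small penalty in this setting means the reconstructions from these decoders come close to what separately trained end-to-end encoder-decoder pairs would achieve at each trade-off point.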