
Contributed Talk
Workshop: Neural Compression: From Information Theory to Applications

Spotlight 1: Lucas Theis & Aaron Wagner, A coding theorem for the rate-distortion-perception function


The rate-distortion-perception function (RDPF; Blau and Michaeli, 2019) has emerged as a useful tool for thinking about realism and distortion of reconstructions in lossy compression. Unlike the rate-distortion function, however, it is unknown whether encoders and decoders exist that achieve the rate suggested by the RDPF. Building on results by Li and El Gamal (2018), we show that the RDPF can indeed be achieved using stochastic, variable-length codecs. For this class of codecs, we also prove that the RDPF lower-bounds the achievable rate.
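For context, the RDPF of Blau and Michaeli (2019) augments the classical rate-distortion trade-off with a divergence constraint on the distribution of reconstructions. A standard formulation (the notation here is a sketch, not taken from the talk) is:

```latex
R(D, P) = \min_{p_{\hat{X} \mid X}} \; I(X; \hat{X})
\quad \text{subject to} \quad
\mathbb{E}\!\left[\Delta(X, \hat{X})\right] \le D,
\qquad
d\!\left(p_X, p_{\hat{X}}\right) \le P,
```

where \Delta is a distortion measure, d is a divergence between the source distribution p_X and the reconstruction distribution p_{\hat{X}}, and P = 0 corresponds to demanding perfect realism (reconstructions distributed exactly like the source).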