

Contributed Talk in Workshop: Neural Compression: From Information Theory to Applications

Spotlight 2: Yibo Yang and Stephan Mandt, Lower Bounding Rate-Distortion From Samples


Abstract: The rate-distortion function $R(D)$ gives the minimum number of bits needed on average to compress a random object within a given distortion tolerance. A lower bound on $R(D)$ therefore represents a fundamental limit on the best possible rate-distortion performance of any lossy compression algorithm, and can help us assess the potential room for improvement. We make a first attempt at an algorithm for computing such a lower bound, applicable to any data source from which we have samples. Based on a dual characterization of $R(D)$ as a constrained maximization problem, our method approximates the exact constraint function by an asymptotically unbiased sample-based estimator, allowing for stochastic optimization. On a 2D Gaussian source, we obtain a lower bound within 1 bit of the true $R(D)$ on average.
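
The abstract does not spell out the dual form, but a classical version (Csiszár's dual characterization) states that for any $\lambda \ge 0$ and any nonnegative function $u$ satisfying $\sup_y \mathbb{E}_X[u(X) e^{-\lambda \rho(X, y)}] \le 1$, we have $R(D) \ge \mathbb{E}[\log u(X)] - \lambda D$. The sketch below is a hypothetical illustration of this idea (not the authors' released code): it parametrizes $\log u$ with a small MLP, estimates the constraint from samples, and runs gradient ascent on the resulting objective. All names and hyperparameters are illustrative, and the crude finite candidate set for the sup over $y$ underestimates the true supremum, so the printed value is only approximate; the paper's asymptotically unbiased estimator is meant to address exactly this kind of estimation error.

```python
# Hypothetical sketch (not the authors' code) of a sample-based dual lower
# bound on R(D), for a 2D standard Gaussian source with squared-error distortion.
import math
import torch
import torch.nn as nn

torch.manual_seed(0)
d, lam, D = 2, 2.0, 0.25                     # dimension, dual slope lambda >= 0, distortion level

# log u(x): the nonnegative dual function, parametrized by a small MLP
log_u_net = nn.Sequential(nn.Linear(d, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(log_u_net.parameters(), lr=1e-3)

for step in range(5000):
    x = torch.randn(1024, d)                 # samples from the source p(x)
    y = torch.randn(512, d)                  # crude candidate set for the sup over reproductions y
    log_u = log_u_net(x).squeeze(-1)
    rho = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)    # pairwise distortions rho(x_i, y_j)
    # log of the sample mean estimating E_X[u(X) exp(-lam * rho(X, y))], per candidate y
    log_psi = torch.logsumexp(log_u[:, None] - lam * rho, dim=0) - math.log(x.shape[0])
    # normalizing u by the (estimated) sup_y keeps the dual constraint satisfied,
    # so the objective below is a valid lower bound up to estimation error
    bound = log_u.mean() - log_psi.max() - lam * D
    opt.zero_grad()
    (-bound).backward()                      # gradient ascent on the lower bound
    opt.step()

# Sanity check: for X ~ N(0, I_2) with total squared-error budget D <= 2,
# reverse water-filling gives R(D) = log(2/D) nats (about 3 bits at D = 0.25).
print(f"estimated lower bound at D={D}: {bound.item() / math.log(2):.3f} bits")
```

Note that a single fixed $\lambda$ yields only one supporting line of $R(D)$ in the $(D, R)$ plane; sweeping or jointly optimizing $\lambda$ would trace out the full lower bound curve.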