Spotlight Poster

Finite-State Autoregressive Entropy Coding for Efficient Learned Lossless Compression

Yufeng Zhang · Hang Yu · Jianguo Li · Weiyao Lin

Halle B #243
Thu 9 May 7:30 a.m. PDT — 9:30 a.m. PDT


Learned lossless data compression has garnered significant attention recently due to its superior compression ratios compared to traditional compressors. However, the computational cost of these models jeopardizes their practicality. This paper proposes a novel system that improves the compression ratio while maintaining computational efficiency for learned lossless data compression. Our approach incorporates two essential innovations. First, we propose the Finite-State AutoRegressive (FSAR) entropy coder, an efficient Markov-model-based autoregressive entropy coder that uses a lookup table to expedite autoregressive entropy coding. Second, we present a Straight-Through Hardmax Quantization (STHQ) scheme to enhance the optimization of the discrete latent space. Our experiments show that the proposed lossless compression method improves the compression ratio by up to 6% compared to the baseline, with negligible extra computational time. Our work provides valuable insights into enhancing the computational efficiency of learned lossless data compression, which has practical applications in various fields. Code is available at
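To illustrate the lookup-table idea behind a Markov-model-based autoregressive entropy coder, here is a minimal sketch (function names and the toy model are illustrative assumptions, not the paper's actual implementation): with a first-order Markov model, the conditional probability an entropy coder needs for each symbol is a single table lookup rather than a per-symbol neural-network forward pass.

```python
import numpy as np

def sequence_codelength_bits(symbols, transition, prior):
    """Ideal codelength (in bits) of a symbol sequence under a
    first-order Markov model. Each step is an O(1) table lookup,
    which is what makes table-based autoregressive coding fast."""
    bits = -np.log2(prior[symbols[0]])               # first symbol from the prior
    for prev, cur in zip(symbols[:-1], symbols[1:]):
        bits += -np.log2(transition[prev][cur])      # P(x_t | x_{t-1}) via lookup
    return bits

# Toy binary source: state 0 and state 1 each tend to repeat.
transition = np.array([[0.9, 0.1],
                       [0.2, 0.8]])
prior = np.array([0.5, 0.5])
seq = [0, 0, 0, 1, 1]
bits = sequence_codelength_bits(seq, transition, prior)
```

A real range or arithmetic coder would additionally precompute per-state cumulative distributions (e.g. `np.cumsum(transition, axis=1)`) so encoding and decoding also reduce to table lookups.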
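The straight-through hardmax idea can be sketched as follows (a generic straight-through estimator in NumPy, assumed for illustration rather than taken from the paper): the forward pass emits a hard one-hot code, while the backward pass, in an autodiff framework, would reuse the softmax gradient by detaching the correction term.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def onehot_argmax(z):
    y = np.zeros_like(z)
    y[np.argmax(z)] = 1.0
    return y

logits = np.array([0.2, 1.5, -0.3])
soft = softmax(logits)
hard = onehot_argmax(logits)
# Straight-through trick: the forward value equals the hard one-hot.
# In a framework like PyTorch, (hard - soft) would be wrapped in a
# stop-gradient / detach, so gradients flow through `soft` only.
st = soft + (hard - soft)
```

Numerically `st` equals `hard`; the point of the construction is purely how gradients are routed during training of the discrete latent space.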
