

Virtual presentation / poster accept

Quantized Compressed Sensing with Score-Based Generative Models

Xiangming Meng · Yoshiyuki Kabashima

Keywords: [ quantization ] [ linear inverse problems ] [ generative models ] [ compressed sensing ]


Abstract: We consider the general problem of recovering a high-dimensional signal from noisy quantized measurements. Quantization, especially coarse quantization such as 1-bit sign measurements, causes severe information loss, so good prior knowledge of the unknown signal is essential for accurate recovery. Motivated by the power of score-based generative models (SGM, also known as diffusion models) to capture the rich structure of natural signals beyond simple sparsity, we propose an unsupervised, data-driven approach called quantized compressed sensing with SGM (QCS-SGM), in which the prior distribution is modeled by a pre-trained SGM. To perform posterior sampling, an annealed pseudo-likelihood score, termed the noise-perturbed pseudo-likelihood score, is introduced and combined with the prior score of the SGM. The proposed QCS-SGM applies to an arbitrary number of quantization bits. Experiments on a variety of baseline datasets demonstrate that QCS-SGM outperforms existing state-of-the-art algorithms by a large margin on both in-distribution and out-of-distribution samples. Moreover, as a posterior sampling method, QCS-SGM can readily provide confidence intervals or uncertainty estimates for the reconstructed results. The code is available at https://github.com/mengxiangming/QCS-SGM.
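The following is a minimal, illustrative sketch of the kind of annealed Langevin posterior sampling the abstract describes, specialized to 1-bit (sign) measurements. The prior score function, the noise schedule, and the Gaussian-CDF pseudo-likelihood gradient used here are assumptions chosen for illustration, not the authors' QCS-SGM implementation; refer to https://github.com/mengxiangming/QCS-SGM for the actual method.

```python
# Sketch: annealed Langevin sampling for 1-bit compressed sensing with a
# score-based prior. All modeling choices below (prior score, noise schedule,
# Gaussian-CDF pseudo-likelihood) are illustrative assumptions, not the
# paper's exact formulas.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

n, m = 64, 32                        # signal dimension, number of measurements
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = rng.standard_normal(n)
y = np.sign(A @ x_true)              # 1-bit (sign) quantized measurements

def score_prior(x, sigma):
    """Placeholder prior score; in QCS-SGM this comes from a pre-trained SGM."""
    return -x / (1.0 + sigma ** 2)   # noise-perturbed standard-Gaussian prior (assumed)

def score_pseudo_likelihood(x, sigma, eps=1e-9):
    """Gradient of sum_i log Phi(y_i * a_i^T x / sigma_eff) w.r.t. x --
    an assumed noise-perturbed pseudo-likelihood for sign measurements."""
    sigma_eff = np.sqrt(sigma ** 2 + 1e-2)     # annealing + measurement noise (assumed)
    z = (y * (A @ x)) / sigma_eff
    ratio = norm.pdf(z) / (norm.cdf(z) + eps)  # d/dz log Phi(z)
    return A.T @ (y * ratio) / sigma_eff

# Annealed Langevin dynamics over a decreasing noise schedule.
sigmas = np.geomspace(1.0, 0.01, 10)
x = rng.standard_normal(n)
for sigma in sigmas:
    step = 1e-2 * sigma ** 2
    for _ in range(50):
        grad = score_prior(x, sigma) + score_pseudo_likelihood(x, sigma)
        x = x + step * grad + np.sqrt(2 * step) * rng.standard_normal(n)

# Sign measurements lose the signal's scale, so compare directions only.
cosine = x @ x_true / (np.linalg.norm(x) * np.linalg.norm(x_true))
print(f"cosine similarity to ground truth: {cosine:.3f}")
```

The key design point mirrored from the abstract is that each Langevin step sums two score terms, a prior score (from the SGM) and a noise-perturbed pseudo-likelihood score for the quantized measurements, so the same loop applies to any number of quantization bits once the likelihood term is swapped accordingly.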
