

Poster

Matrix Product Sketching via Coordinated Sampling

Majid Daliri · Juliana Freire · Danrong Li · Christopher Musco

Hall 3 + Hall 2B #457
Wed 23 Apr 7 p.m. PDT — 9:30 p.m. PDT

Abstract: We revisit the well-studied problem of approximating a matrix product, $\mathbf{A}^\top \mathbf{B}$, based on small space sketches $\mathcal{S}(\mathbf{A})$ and $\mathcal{S}(\mathbf{B})$ of $\mathbf{A} \in \mathbb{R}^{n \times d}$ and $\mathbf{B} \in \mathbb{R}^{n \times m}$. We are interested in the setting where the sketches must be computed independently of each other, except for the use of a shared random seed. We prove that, when $\mathbf{A}$ and $\mathbf{B}$ are sparse, methods based on coordinated random sampling can outperform classical linear sketching approaches, like Johnson-Lindenstrauss Projection or CountSketch. For example, to obtain Frobenius norm error $\epsilon \|\mathbf{A}\|_F \|\mathbf{B}\|_F$, coordinated sampling requires sketches of size $O(s/\epsilon^2)$ when $\mathbf{A}$ and $\mathbf{B}$ have at most $s \leq d, m$ non-zeros per row. In contrast, linear sketching leads to sketches of size $O(d/\epsilon^2)$ and $O(m/\epsilon^2)$ for $\mathbf{A}$ and $\mathbf{B}$. We empirically evaluate our approach on two applications: 1) distributed linear regression in databases, a problem motivated by tasks like dataset discovery and augmentation, and 2) approximating attention matrices in transformer-based language models. In both cases, our sampling algorithms yield an order of magnitude improvement over linear sketching.
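To make the coordinated-sampling idea concrete, here is a minimal numpy sketch of one way such a scheme can work; the helper names `sketch` and `estimate_product`, the size parameter `k`, and the specific threshold-sampling variant are illustrative assumptions, not the paper's exact algorithm. Each matrix samples rows with probability proportional to their squared norms, the shared random seed makes the two sketches agree on the per-row hashes $h(i)$, and rows that survive in both sketches are reweighted to give an unbiased estimate of $\mathbf{A}^\top \mathbf{B}$.

```python
import numpy as np

def sketch(M, k, seed):
    """Coordinated threshold-sampling sketch of the rows of M (illustrative).

    Row i is kept if a uniform hash h(i), derived from the shared seed,
    falls below an inclusion probability proportional to the row's
    squared norm (capped at 1, with expected sample size about k).
    """
    rng = np.random.default_rng(seed)         # shared seed => shared hashes
    h = rng.uniform(size=M.shape[0])          # h(i) ~ Uniform[0, 1]
    norms = np.einsum("ij,ij->i", M, M)       # squared row norms
    p = np.minimum(1.0, k * norms / norms.sum())
    idx = np.flatnonzero(h <= p)              # sampled row indices
    return idx, M[idx], p[idx]

def estimate_product(sketch_a, sketch_b):
    """Unbiased estimate of A^T B from two coordinated sketches."""
    ia, rows_a, pa = sketch_a
    ib, rows_b, pb = sketch_b
    # Because both sketches hash row i to the same h(i), row i survives
    # in *both* iff h(i) <= min(p_i^A, p_i^B); reweighting each surviving
    # outer product a_i b_i^T by that probability makes the sum unbiased.
    common, loc_a, loc_b = np.intersect1d(ia, ib, return_indices=True)
    w = np.minimum(pa[loc_a], pb[loc_b])
    return (rows_a[loc_a] / w[:, None]).T @ rows_b[loc_b]

# Usage: sketch A and B independently, sharing only the random seed.
rng = np.random.default_rng(0)
A = rng.standard_normal((10_000, 20))
B = rng.standard_normal((10_000, 30))
est = estimate_product(sketch(A, k=2_000, seed=7), sketch(B, k=2_000, seed=7))
rel = np.linalg.norm(est - A.T @ B) / (np.linalg.norm(A) * np.linalg.norm(B))
print(f"relative Frobenius error: {rel:.3f}")
```

The shared seed is what distinguishes this from independent sampling: if each side drew its own hashes, the two samples would rarely pick the same rows and the intersection used by the estimator would be nearly empty.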
