Poster

Improved Algorithms for Kernel Matrix-Vector Multiplication Under Sparsity Assumptions

Piotr Indyk · Michael Kapralov · Kshiteej Jitesh Sheth · Tal Wagner

Hall 3 + Hall 2B #146
Sat 26 Apr midnight PDT — 2:30 a.m. PDT

Abstract: Motivated by the problem of fast processing of attention matrices, we study fast algorithms for computing matrix-vector products for asymmetric Gaussian kernel matrices $K \in \mathbb{R}^{n \times n}$. $K$'s columns are indexed by a set of $n$ keys $k_1, k_2, \ldots, k_n \in \mathbb{R}^d$, its rows by a set of $n$ queries $q_1, q_2, \ldots, q_n \in \mathbb{R}^d$, and its $(i,j)$ entry is $K_{ij} = e^{-\|q_i - k_j\|_2^2 / 2\sigma^2}$ for some bandwidth parameter $\sigma > 0$. Given a vector $x \in \mathbb{R}^n$ and an error parameter $\epsilon > 0$, our task is to output a vector $y \in \mathbb{R}^n$ such that $\|Kx - y\|_2 \le \epsilon \|x\|_2$, in time subquadratic in $n$ and linear in $d$. Our algorithms rely on the following modelling assumption about the matrices $K$: the sum of the entries of $K$ scales linearly in $n$, as opposed to worst-case quadratic growth. We validate this assumption experimentally for Gaussian kernel matrices encountered in various settings, such as fast attention computation in LLMs. Under this assumption, we obtain the first subquadratic-time algorithm for kernel matrix-vector multiplication for unrestricted vectors.
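
To make the problem setup concrete, here is a minimal sketch of the naive $O(n^2 d)$ baseline the paper aims to beat, together with an empirical check of the paper's modelling assumption that $\sum_{i,j} K_{ij} = O(n)$. This is illustrative only, not the paper's algorithm; the function name, the synthetic data, and the choices of `n`, `d`, and `sigma` are assumptions made for the example.

```python
import numpy as np

def gaussian_kernel_matvec(queries, keys, x, sigma=1.0):
    """Naive O(n^2 d) baseline: compute y = K x, where
    K_ij = exp(-||q_i - k_j||_2^2 / (2 sigma^2))."""
    # Pairwise squared distances between queries (rows) and keys (columns).
    sq_dists = (
        np.sum(queries**2, axis=1)[:, None]
        - 2.0 * queries @ keys.T
        + np.sum(keys**2, axis=1)[None, :]
    )
    K = np.exp(-sq_dists / (2.0 * sigma**2))
    return K @ x, K

# Example usage on synthetic data (n, d, sigma are arbitrary choices).
rng = np.random.default_rng(0)
n, d, sigma = 1000, 16, 0.5
queries = rng.standard_normal((n, d))
keys = rng.standard_normal((n, d))
x = rng.standard_normal(n)

y, K = gaussian_kernel_matvec(queries, keys, x, sigma)

# The modelling assumption: the sum of K's entries grows as O(n) rather
# than the worst-case O(n^2), so this ratio should stay bounded as n grows.
print("sum(K) / n =", K.sum() / n)
```

A subquadratic algorithm must approximate `K @ x` to within $\epsilon \|x\|_2$ in $\ell_2$ norm without ever materializing the $n \times n$ matrix $K$ as this baseline does.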
