Poster
Improved Algorithms for Kernel Matrix-Vector Multiplication Under Sparsity Assumptions
Piotr Indyk · Michael Kapralov · Kshiteej Jitesh Sheth · Tal Wagner
Hall 3 + Hall 2B #146
Sat 26 Apr, midnight to 2:30 a.m. PDT
Abstract:
Motivated by the problem of fast processing of attention matrices, we study fast algorithms for computing matrix-vector products for asymmetric Gaussian kernel matrices $K \in \mathbb{R}^{n \times n}$. The columns of $K$ are indexed by a set of $n$ keys $k_1, k_2, \ldots, k_n \in \mathbb{R}^d$, its rows by a set of $n$ queries $q_1, q_2, \ldots, q_n \in \mathbb{R}^d$, and its $(i,j)$ entry is $K_{ij} = e^{-\|q_i - k_j\|_2^2 / 2\sigma^2}$ for some bandwidth parameter $\sigma > 0$. Given a vector $x \in \mathbb{R}^n$ and an error parameter $\epsilon > 0$, the task is to output a $y \in \mathbb{R}^n$ such that $\|Kx - y\|_2 \le \epsilon \|x\|_2$ in time subquadratic in $n$ and linear in $d$. Our algorithms rely on the following modelling assumption about the matrices $K$: the sum of the entries of $K$ scales linearly in $n$, as opposed to the worst-case quadratic growth. We validate this assumption experimentally for Gaussian kernel matrices encountered in various settings, such as fast attention computation in LLMs. Under this assumption, we obtain the first subquadratic-time algorithm for kernel matrix-vector multiplication for unrestricted vectors.
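As a minimal sketch (not the paper's algorithm), the quantities in the abstract can be made concrete in NumPy: the exact $O(n^2 d)$ baseline for $y = Kx$, and the entry-sum statistic $\sum_{i,j} K_{ij}$ behind the modelling assumption. All variable names and the choice of random Gaussian data here are illustrative assumptions, not from the paper.

```python
import numpy as np

def gaussian_kernel_matvec(queries, keys, x, sigma=1.0):
    """Exact quadratic-time baseline: y = K x with
    K_ij = exp(-||q_i - k_j||_2^2 / (2 sigma^2))."""
    # Pairwise squared distances via broadcasting, shape (n, n).
    sq_dists = np.sum((queries[:, None, :] - keys[None, :, :]) ** 2, axis=2)
    K = np.exp(-sq_dists / (2.0 * sigma ** 2))
    return K @ x, K

# Illustrative random inputs (the paper instead considers, e.g.,
# keys/queries arising in attention layers of LLMs).
rng = np.random.default_rng(0)
n, d = 200, 16
queries = rng.normal(size=(n, d))
keys = rng.normal(size=(n, d))
x = rng.normal(size=n)

y, K = gaussian_kernel_matvec(queries, keys, x, sigma=1.0)

# The modelling assumption: the sum of K's entries grows linearly in n
# rather than quadratically, i.e. K.sum() / n stays bounded as n grows.
print(K.sum() / n)
```

The subquadratic algorithms of the paper avoid ever forming $K$ explicitly; this dense baseline only serves to define the target quantity $Kx$ and the entry-sum assumption being exploited.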