Poster
Fast Regression for Structured Inputs
Raphael Meyer · Cameron Musco · Christopher Musco · David Woodruff · Samson Zhou
Keywords: [ regression ]
Abstract:
We study the ℓ_p regression problem, which requires finding x ∈ ℝ^d that minimizes ‖Ax − b‖_p for a matrix A ∈ ℝ^{n×d} and response vector b ∈ ℝ^n. There has been recent interest in developing subsampling methods for this problem that can outperform standard techniques when n is very large. However, all known subsampling approaches have run time that depends exponentially on p, typically d^{O(p)}, which can be prohibitively expensive. We improve on this work by showing that for a large class of common *structured matrices*, such as combinations of low-rank matrices, sparse matrices, and Vandermonde matrices, there are subsampling based methods for ℓ_p regression that depend polynomially on p. For example, we give an algorithm for ℓ_p regression on Vandermonde matrices that runs in time O(n log^3 n + (dp^2)^{0.5+ω} · polylog n), where ω is the exponent of matrix multiplication. The polynomial dependence on p crucially allows our algorithms to extend naturally to efficient algorithms for ℓ_∞ regression, via approximation of ℓ_∞ by ℓ_{O(log n)}. Of practical interest, we also develop a new subsampling algorithm for ℓ_p regression for arbitrary matrices, which is simpler than previous approaches for p ≥ 4.
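The ℓ_∞-by-ℓ_{O(log n)} approximation mentioned above rests on a standard norm inequality, not on code from the paper: for any x ∈ ℝ^n, ‖x‖_∞ ≤ ‖x‖_p ≤ n^{1/p} ‖x‖_∞, so taking p = ln n makes the distortion factor n^{1/p} = e, a constant. A minimal NumPy sketch of this fact (the vector and dimensions here are illustrative, not from the paper):

```python
import numpy as np

# For any x in R^n:  ||x||_inf <= ||x||_p <= n^(1/p) * ||x||_inf.
# With p = ln(n), the factor n^(1/p) equals e, so the l_p norm is a
# constant-factor proxy for the l_inf norm.
rng = np.random.default_rng(0)
n = 10_000
x = rng.standard_normal(n)  # illustrative test vector

p = np.log(n)  # p = O(log n); here n^(1/p) = e ≈ 2.718
linf = np.max(np.abs(x))
lp = np.sum(np.abs(x) ** p) ** (1.0 / p)

# The l_p norm sandwiches the l_inf norm up to a factor of e.
assert linf <= lp <= np.e * linf
```

This is why a regression algorithm whose cost grows only polynomially in p can be run at p = O(log n) to get a constant-factor ℓ_∞ guarantee, whereas a d^{O(p)} method would become quasi-polynomial in n at that setting.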