Poster
Effective Distributed Learning with Random Features: Improved Bounds and Algorithms
Yong Liu · Jiankun Liu · Shuqiang Wang
Keywords: [ kernel methods ] [ statistical learning theory ] [ risk bounds ]
Abstract:
In this paper, we study the statistical properties of distributed kernel ridge regression together with random features (DKRR-RF) and obtain optimal generalization bounds under the basic setting, which substantially relax the restriction on the number of local machines imposed by existing state-of-the-art bounds. Specifically, we first show that a simple combination of the divide-and-conquer technique and random features achieves the same statistical accuracy as exact KRR in expectation while requiring far less memory and time. Then, beyond generalization bounds in expectation, which reflect only the average behavior over multiple trials, we derive generalization bounds in probability that capture the learning performance of a single trial. Finally, we propose an effective communication strategy to further improve the performance of DKRR-RF, and validate the theoretical bounds via numerical experiments.
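To make the setup concrete, below is a minimal sketch of the basic DKRR-RF idea the abstract describes: data is split across m local machines, each machine solves a ridge regression in a shared random Fourier feature space, and the local weight vectors are averaged. This is an illustrative reconstruction, not the authors' exact algorithm or communication strategy; the Gaussian kernel, the function names, and all parameters (random_fourier_features, dkrr_rf_fit, m, D, sigma, lam) are assumptions made for the example.

```python
import numpy as np

def random_fourier_features(X, W, b):
    """Map inputs to D random Fourier features approximating a Gaussian kernel."""
    return np.sqrt(2.0 / W.shape[1]) * np.cos(X @ W + b)

def dkrr_rf_fit(X, y, m=4, D=100, sigma=1.0, lam=1e-3, seed=None):
    """Divide-and-conquer KRR with random features (illustrative sketch):
    each of the m local machines solves a ridge problem in the shared
    feature space, and the global estimator averages the local weights."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Shared random feature parameters (identical on every machine):
    # frequencies ~ N(0, sigma^{-2} I) approximate the Gaussian kernel.
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    weights = []
    # Assumes the data is shuffled, so contiguous splits act as random partitions.
    for Xj, yj in zip(np.array_split(X, m), np.array_split(y, m)):
        Phi = random_fourier_features(Xj, W, b)           # n_j x D features
        A = Phi.T @ Phi / len(yj) + lam * np.eye(D)       # regularized covariance
        weights.append(np.linalg.solve(A, Phi.T @ yj / len(yj)))
    w_bar = np.mean(weights, axis=0)                      # divide-and-conquer average
    return lambda X_new: random_fourier_features(X_new, W, b) @ w_bar
```

Averaging the local weight vectors is the classic one-shot divide-and-conquer estimator; the communication strategy proposed in the paper refines the estimator beyond this single averaging round.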