"Generative models that use explicit density modeling (e.g., variational autoencoders, flow-based generative models) often involve finding the optimal mapping (i.e., transfer operator) from a known distribution, e.g., a Gaussian, to the unknown input distribution. This typically requires searching over a class of non-linear functions (e.g., functions representable by a deep neural network). While effective in practice, the associated computational/memory costs can increase rapidly, usually as a function of the performance desired in an application. We propose a substantially cheaper (and simpler) distribution matching strategy by leveraging recent developments in neural kernels together with ideas from known results on kernel transfer operators. We show that our formulation enables highly efficient distribution approximation and sampling, and offers empirical performance that compares very favorably with powerful baselines, with significant savings in runtime. We also show that the algorithm performs well in small-sample-size settings."
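
To make the core idea concrete, here is a minimal illustrative sketch (not the paper's actual method) of fitting a transfer map from a known base distribution to observed samples using kernel methods. It uses the fact that in one dimension the optimal transport map is the monotone rearrangement of sorted samples, which can then be smoothed by kernel ridge regression; the function name, kernel choice (Gaussian), and hyperparameters are all assumptions for illustration.

```python
import numpy as np

def fit_kernel_transfer_map(z, x, gamma=1.0, lam=1e-3):
    """Illustrative sketch: fit a 1-D transport map T with T(z_i) ~= x_i
    by kernel ridge regression on sorted samples (monotone rearrangement).
    `gamma` is the Gaussian-kernel width, `lam` the ridge regularizer."""
    z, x = np.sort(z), np.sort(x)  # pair i-th base quantile with i-th target quantile
    K = np.exp(-gamma * (z[:, None] - z[None, :]) ** 2)  # Gram matrix on base samples
    alpha = np.linalg.solve(K + lam * np.eye(len(z)), x)  # ridge-regression weights

    def T(z_new):
        # Evaluate the fitted map at new base-distribution draws.
        K_new = np.exp(-gamma * (np.asarray(z_new)[:, None] - z[None, :]) ** 2)
        return K_new @ alpha

    return T

rng = np.random.default_rng(0)
z = rng.standard_normal(500)        # samples from the known base (Gaussian)
x = rng.exponential(1.0, 500)       # samples from the "unknown" target
T = fit_kernel_transfer_map(z, x)
samples = T(rng.standard_normal(1000))  # approximate fresh draws from the target
```

Once the weights are solved for, sampling only requires drawing from the base distribution and evaluating the fitted map, which is a single matrix product; this closed-form fit-then-sample structure is the source of the runtime savings the abstract refers to, in contrast to iteratively training a deep generative model.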