Poster

A Temporal Kernel Approach for Deep Learning with Continuous-time Information

Da Xu · Chuanwei Ruan · Evren Korpeoglu · Sushant Kumar · Kannan Achan

Keywords: [ Reparameterization ] [ Random Feature ] [ Spectral Distribution ] [ Continuous-time System ] [ Kernel Learning ] [ Learning Theory ]


Abstract:

Sequential deep learning models such as RNNs, causal CNNs, and attention mechanisms do not readily consume continuous-time information. Discretizing the temporal data, as we show, causes inconsistency even for simple continuous-time processes. Current approaches often handle time heuristically so as to remain compatible with existing deep learning architectures and implementations. In this paper, we provide a principled way to characterize continuous-time systems using deep learning tools. Notably, the proposed approach applies to all the major deep learning architectures and requires little modification to the implementation. The critical insight is to represent the continuous-time system by composing neural networks with a temporal kernel, where we draw our intuition from recent advances in understanding deep learning via Gaussian processes and the neural tangent kernel. To represent the temporal kernel, we introduce a random-feature approach and convert the kernel learning problem to spectral density estimation under reparameterization. We further prove convergence and consistency results even when the temporal kernel is non-stationary and the spectral density is misspecified. Simulations and real-data experiments demonstrate the empirical effectiveness of our temporal kernel approach in a broad range of settings.
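
As an illustration of the random-feature idea described in the abstract, the following minimal PyTorch sketch approximates a stationary temporal kernel via Bochner's theorem and learns the parameters of a spectral density through the reparameterization trick. The module name, the Gaussian family for the spectral density, and all hyperparameters are illustrative assumptions, not the authors' implementation.

import math
import torch
import torch.nn as nn

class TemporalRandomFeatures(nn.Module):
    # Random Fourier features for a stationary temporal kernel.
    # By Bochner's theorem, k(t - t') = E_{w ~ p}[cos(w (t - t'))] for a
    # spectral density p, so sampling frequencies w_i ~ p yields the
    # Monte Carlo feature map phi(t) = sqrt(2/m) cos(w t + b), with
    # phi(t) . phi(t') approximating k(t - t').
    # Illustrative assumption: p is Gaussian with learnable mean/scale,
    # and frequencies are drawn via the reparameterization trick so that
    # gradients reach the spectral-density parameters.
    def __init__(self, num_features: int = 64):
        super().__init__()
        self.num_features = num_features
        self.mu = nn.Parameter(torch.zeros(num_features))         # spectral mean
        self.log_sigma = nn.Parameter(torch.zeros(num_features))  # spectral log-scale
        self.register_buffer("b", 2 * math.pi * torch.rand(num_features))  # fixed phases

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (..., 1) tensor of continuous timestamps.
        eps = torch.randn(self.num_features, device=t.device)
        w = self.mu + self.log_sigma.exp() * eps  # reparameterized draw w ~ p
        return math.sqrt(2.0 / self.num_features) * torch.cos(t * w + self.b)

# Usage: the features compose with any sequence model's hidden states,
# and phi(t_i) . phi(t_j) approximates the temporal kernel k(t_i - t_j).
feats = TemporalRandomFeatures(num_features=64)
t = torch.tensor([[0.0], [0.7], [2.3]])  # irregularly spaced timestamps
phi = feats(t)                           # shape (3, 64)
kernel_approx = phi @ phi.T              # Gram matrix of the temporal kernel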
