

Spotlight in Workshop: Machine Learning Multiscale Processes

A Joint Space-Time Encoder for Geographic Time-Series Data

David Mickisch · Konstantin Klemmer · Mélisande Teng · David Rolnick

Keywords: [ location encodings ] [ geographic time series ] [ surrogate modelling ] [ deep learning regularization ] [ climate and weather ]


Abstract:

Many real-world processes are characterized by complex spatio-temporal dependencies, from climate dynamics to disease spread. Here, we introduce a new neural network architecture to model such dynamics at scale: the \emph{Space-Time Encoder}. Building on recent advances in \emph{location encoders}, models that take geographic coordinates as input, we develop a method that takes in geographic and temporal information simultaneously and learns smooth, continuous functions in both space and time. The inputs are first transformed using positional encoding functions and then fed into neural networks that can learn complex functions. We implement a prototype of the \emph{Space-Time Encoder}, discuss the design choices of the novel temporal encoding, and demonstrate its utility in climate model emulation. Finally, we discuss the potential of the method across use cases, as well as promising avenues for further methodological innovation.
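To make the described pipeline concrete, here is a minimal sketch of a joint space-time encoder in PyTorch. It assumes sinusoidal positional encodings for longitude, latitude, and time, concatenated and passed through a small MLP; the paper's actual encoding functions, network depth, and hyperparameters are not specified in the abstract and may differ.

```python
# Hypothetical sketch, not the authors' exact architecture: coordinates and time
# are mapped through sinusoidal positional encodings, concatenated, and fed to an MLP.
import math
import torch
import torch.nn as nn


def sinusoidal_features(x, num_freqs=8):
    """Map scalars in [-1, 1] to [sin(2^k * pi * x), cos(2^k * pi * x)] for k = 0..num_freqs-1."""
    freqs = (2.0 ** torch.arange(num_freqs, device=x.device)) * math.pi
    angles = x.unsqueeze(-1) * freqs                      # (..., num_freqs)
    return torch.cat([angles.sin(), angles.cos()], dim=-1)


class SpaceTimeEncoder(nn.Module):
    def __init__(self, num_freqs=8, hidden=128, out_dim=1):
        super().__init__()
        self.num_freqs = num_freqs
        in_dim = 3 * 2 * num_freqs                        # lon, lat, time; sin and cos per frequency
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, lonlat, t):
        # lonlat: (N, 2) and t: (N,), both normalized to [-1, 1]
        feats = torch.cat(
            [
                sinusoidal_features(lonlat[:, 0], self.num_freqs),
                sinusoidal_features(lonlat[:, 1], self.num_freqs),
                sinusoidal_features(t, self.num_freqs),
            ],
            dim=-1,
        )
        return self.mlp(feats)


# Example: predict a scalar field (e.g. a climate variable) at four space-time points.
model = SpaceTimeEncoder()
lonlat = torch.rand(4, 2) * 2 - 1
t = torch.rand(4) * 2 - 1
print(model(lonlat, t).shape)                             # torch.Size([4, 1])
```

Because the encoder maps arbitrary continuous coordinates and timestamps to predictions, it can be queried off-grid, which is what makes this style of model attractive as a smooth surrogate for gridded climate simulations.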
