Poster in Workshop: Machine Learning for Remote Sensing (ML4RS)
Encoding Agent Trajectories as Representations with Sequence Transformers
Athanasios Tsiligkaridis · Zhongheng Li · Elizabeth Hou
Spatiotemporal data faces many challenges analogous to those of natural language text, including the ordering of locations (words) in a sequence, long-range dependencies between locations, and locations having multiple meanings. In this work, we propose a novel model that represents high-dimensional spatiotemporal trajectories as sequences of discrete locations and encodes them with a Transformer-based neural network architecture. Similar to language models, our Sequence Transformer for Agent Representation Encodings (STARE) model learns these encodings through the supervisory signal of various tasks, e.g., classification of agent trajectories. We present experimental results on various synthetic and real trajectory datasets and show that our proposed STARE model correctly predicts labels while learning meaningful encodings.
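The abstract describes representing trajectories as sequences of discrete locations, in analogy with word tokens. A minimal sketch of that discretization step might look like the following; the grid bounds, cell count, and function name are illustrative assumptions, not details from the paper:

```python
# Hypothetical sketch: discretize a continuous (lat, lon) trajectory into
# grid-cell token ids, the kind of sequence a Transformer encoder such as
# STARE could consume. Grid bounds and resolution are assumed for illustration.

def trajectory_to_tokens(points, lat_min=40.0, lat_max=41.0,
                         lon_min=-74.0, lon_max=-73.0, n_cells=100):
    """Map (lat, lon) points onto an n_cells x n_cells grid and return cell ids."""
    tokens = []
    for lat, lon in points:
        # Clamp to the last cell so points on the upper boundary stay in range.
        row = min(int((lat - lat_min) / (lat_max - lat_min) * n_cells), n_cells - 1)
        col = min(int((lon - lon_min) / (lon_max - lon_min) * n_cells), n_cells - 1)
        tokens.append(row * n_cells + col)  # flatten the 2-D cell index to one id
    return tokens

traj = [(40.055, -73.945), (40.055, -73.935), (40.065, -73.935)]
print(trajectory_to_tokens(traj))  # three discrete location tokens
```

Each token id then plays the role of a word id in a language model, so the resulting sequences can be fed to a standard Transformer encoder with an embedding layer and a classification head.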