LORD: Lower-Dimensional Embedding of Log-Signature in Neural Rough Differential Equations

Jaehoon Lee · Jinsung Jeon · Sheo Yon Jhin · Jihyeon Hyeong · Jayoung Kim · Minju Jo · Seungji Kook · Noseong Park

[ Abstract ]
Thu 28 Apr 2:30 a.m. PDT — 4:30 a.m. PDT


The problem of processing very long time-series data (e.g., sequences longer than 10,000 steps) is a long-standing research problem in machine learning. Recently, a breakthrough method called neural rough differential equations (NRDEs) was proposed and shown to be able to process such data. Its main concept is the log-signature transform, which is known to be more efficient than the Fourier transform for irregular long time-series, and which converts a very long time-series sample into a relatively short series of feature vectors. However, the log-signature transform incurs non-trivial spatial overheads. To address this, we present the method of LOweR-Dimensional embedding of log-signature (LORD), in which an NRDE-based autoencoder implants higher-depth log-signature knowledge into the lower-depth log-signature. We show that the encoder successfully combines the higher-depth and lower-depth log-signature knowledge, which greatly stabilizes the training process and increases model accuracy. In our experiments with benchmark datasets, our method improves various classification and forecasting evaluation metrics by up to 75%.
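The core preprocessing step the abstract describes is the log-signature transform: a long series is split into windows, and each window is summarized by a short feature vector, so an NRDE only has to step over the (much shorter) sequence of windows. As a minimal illustration of the idea, the sketch below computes the depth-2 log-signature of each window in pure NumPy (total increment plus Lévy area); the function names `depth2_logsig` and `logsig_windows` are hypothetical, and the paper's method uses higher depths and learned embeddings, which this sketch does not implement.

```python
import numpy as np

def depth2_logsig(path):
    """Depth-2 log-signature of a piecewise-linear path.

    path: (n_points, d) array. Returns d + d*(d-1)//2 features:
    the total increment (level 1) and the Lévy areas (level 2).
    """
    inc = np.diff(path, axis=0)              # per-step increments, (n-1, d)
    level1 = path[-1] - path[0]              # total increment, d features
    # Level-2 signature: sum_k (X_k - X_0) (x) dX_k + 1/2 dX_k (x) dX_k
    disp = path[:-1] - path[0]               # displacement from start, (n-1, d)
    S2 = disp.T @ inc + 0.5 * inc.T @ inc    # (d, d)
    # The level-2 log-signature keeps only the antisymmetric (Lévy-area) part.
    A = 0.5 * (S2 - S2.T)
    iu = np.triu_indices(path.shape[1], k=1)
    return np.concatenate([level1, A[iu]])

def logsig_windows(series, window):
    """Convert a long series into a short sequence of log-signature vectors.

    Consecutive windows share a boundary point, so the underlying path
    is covered without gaps.
    """
    chunks = [series[i:i + window + 1]
              for i in range(0, len(series) - 1, window)]
    return np.stack([depth2_logsig(c) for c in chunks])

# A 10,001-step, 3-dimensional series becomes 100 vectors of 6 features each.
series = np.cumsum(np.random.default_rng(0).standard_normal((10001, 3)), axis=0)
features = logsig_windows(series, window=100)
print(features.shape)  # (100, 6)
```

This also makes the spatial overhead the abstract mentions concrete: the depth-2 log-signature already has d + d(d-1)/2 terms per window, and the term count grows rapidly with depth, which is what LORD's lower-dimensional embedding targets.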
