

In-Person Poster presentation / poster accept

Temporal Dependencies in Feature Importance for Time Series Prediction

Kin Kwan Leung · Clayton Rooke · Jonathan Smith · Saba Zuberi · Maksims Volkovs

MH1-2-3-4 #138

Keywords: [ recurrent ] [ time series ] [ explainability ] [ Social Aspects of Machine Learning ]


Abstract:

Time series data introduces two key challenges for explainability methods: first, observations of the same feature over subsequent time steps are not independent, and second, the same feature can have varying importance to model predictions over time. In this paper, we propose Windowed Feature Importance in Time (WinIT), a feature-removal-based explainability approach to address these issues. Unlike existing feature-removal explanation methods, WinIT explicitly accounts for the temporal dependence between different observations of the same feature in the construction of its importance score. Furthermore, WinIT captures the varying importance of a feature over time by summarizing its importance over a window of past time steps. We conduct an extensive empirical study on synthetic and real-world data, compare against a wide range of leading explainability methods, and explore the impact of various evaluation strategies. Our results show that WinIT achieves significant gains over existing methods, with more consistent performance across different evaluation metrics.
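The abstract describes the approach only at a high level. The sketch below illustrates the general idea of feature-removal importance computed over a window of past time steps; it is a simplified illustration under stated assumptions, not the paper's WinIT formulation. The black-box model interface, the zero baseline used for masking, and the absolute-difference score are all assumptions made for the example.

```python
import numpy as np

def windowed_feature_importance(model, x, feature, t, window, baseline=0.0):
    """Sketch of a windowed feature-removal importance score (not the exact WinIT method).

    For a prediction at time step t, each past observation of `feature` inside the
    preceding window is masked (replaced with a baseline value) and the resulting
    shift in the model's output is recorded. The per-step shifts summarize how the
    feature's importance varies across the window.

    Args:
        model: callable mapping an array of shape (T, D) to a scalar prediction.
        x: observed series, array of shape (T, D).
        feature: index of the feature to score.
        t: time step of the prediction being explained.
        window: number of past steps to consider.
        baseline: value standing in for the removed observation.

    Returns:
        Array of length `window`, one importance value per masked past step.
    """
    original = model(x[: t + 1])
    scores = np.zeros(window)
    for k in range(window):
        step = t - k
        if step < 0:
            break
        x_masked = x[: t + 1].copy()
        x_masked[step, feature] = baseline  # remove a single past observation
        scores[k] = abs(original - model(x_masked))
    return scores

# Toy usage: a "model" that sums feature 0 over the last 5 steps, so only
# the 5 most recent masked steps shift the prediction.
rng = np.random.default_rng(0)
series = rng.normal(size=(50, 3))
toy_model = lambda seq: float(seq[-5:, 0].sum())
print(windowed_feature_importance(toy_model, series, feature=0, t=49, window=8))
```

The actual method additionally accounts for temporal dependence between observations of the same feature, rather than masking each step independently as this sketch does.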
