SkipW: Resource Adaptable RNN with Strict Upper Computational Limit

Tsiry MAYET · Anne Lambert · Pascal Le Guyadec · Françoise Le Bolzer · François Schnitzler


Keywords: [ Computational resources ] [ Flexibility ] [ recurrent neural networks ]

Tue 4 May 1 a.m. PDT — 3 a.m. PDT


We introduce Skip-Window, a method that allows recurrent neural networks (RNNs) to trade accuracy for computational cost while analyzing a sequence. Like existing approaches, Skip-Window extends existing RNN cells with a mechanism that encourages the model to process fewer inputs. Unlike existing approaches, Skip-Window respects a strict computational budget, making it more suitable for limited hardware. We evaluate the approach on two datasets: a human activity recognition task and an adding task. Our results show that Skip-Window exceeds the accuracy of existing approaches at a lower computational cost while strictly limiting that cost.
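The core idea of the abstract (a hard per-window cap on how many inputs the RNN may process) can be sketched as follows. This is an illustrative toy, not the authors' implementation: the importance score and the scalar "cell" update below are stand-ins for the learned skipping mechanism and the real RNN cell, and the names `skip_window`, `w`, and `k` are assumptions for illustration.

```python
def skip_window(sequence, w, k, score):
    """Toy sketch of a windowed skipping budget: within each window of w
    timesteps, at most k inputs update the hidden state, so the number of
    cell updates is strictly bounded regardless of the sequence content.
    `score` is a stand-in for the learned importance mechanism."""
    h = 0.0          # scalar hidden state standing in for the RNN cell state
    updates = 0      # counts actual cell updates (the computational cost)
    for start in range(0, len(sequence), w):
        window = sequence[start:start + w]
        # Keep only the k highest-scoring inputs in this window (a stand-in
        # selection rule; the paper learns which inputs to skip).
        chosen = sorted(range(len(window)),
                        key=lambda i: score(window[i]),
                        reverse=True)[:k]
        for i in sorted(chosen):  # preserve temporal order among kept inputs
            h = 0.5 * h + 0.5 * window[i]  # toy cell update
            updates += 1
    return h, updates


seq = [1, 9, 2, 8, 3, 7, 4, 6, 5, 0]
h, updates = skip_window(seq, w=5, k=2, score=abs)
# Cost is capped at k updates per window: here 2 windows of 5, so at most 4.
```

The point of the sketch is the strict bound: `updates` can never exceed `k` times the number of windows, which is the property distinguishing Skip-Window from approaches that only encourage skipping on average.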
