
Poster

InstaTrain: Adaptive Training via Ultra-Fast Natural Annealing within Dynamical Systems

Chuan Liu · Ruibing Song · Chunshu Wu · Pouya Haghi · Tong Geng

Hall 3 + Hall 2B #455
Fri 25 Apr midnight PDT — 2:30 a.m. PDT

Abstract:

Time-series modeling is broadly adopted to capture underlying patterns in historical data, enabling prediction of future values. However, one crucial aspect of such modeling is often overlooked: in highly dynamic environments, data distributions can shift drastically within a second or less. Under these circumstances, traditional predictive models, and even online learning methods, struggle to adapt to such ultra-fast and complex distribution shifts. To address this, we propose InstaTrain, a novel learning approach that enables ultra-fast model updates for real-world prediction tasks, thereby keeping pace with rapidly evolving data distributions. In this work, (1) we transform the slow and expensive training process into an ultra-fast natural annealing process within a dynamical system, and (2) we augment a recently proposed electronic dynamical system with parameter-update modules, extending its capabilities to encompass both rapid training and inference. Experimental results on highly dynamic datasets demonstrate that our method achieves orders-of-magnitude improvements in training speed and energy efficiency while delivering superior accuracy compared to baselines running on GPUs.
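The abstract contrasts InstaTrain with conventional online learning under abrupt distribution shift. The sketch below is a software-only toy illustrating that baseline behavior, not the paper's dynamical-system method: a per-sample SGD learner tracking a synthetic stream whose generating slope jumps at regime boundaries. The stream generator, model, and all parameters (`shift_every`, `lr`) are hypothetical choices for illustration.

```python
import random

random.seed(0)

def generate_stream(n_steps, shift_every=50):
    """Synthetic 1-D stream whose true slope jumps abruptly every
    `shift_every` steps, mimicking a rapid distribution shift.
    Illustrative only; not the datasets used in the paper."""
    slope = 1.0
    for t in range(n_steps):
        if t % shift_every == 0:
            slope = random.uniform(-2.0, 2.0)  # abrupt regime change
        x = random.uniform(-1.0, 1.0)
        y = slope * x + random.gauss(0.0, 0.01)
        yield x, y

def online_sgd(stream, lr=0.5):
    """Per-sample SGD on a scalar linear model y ≈ w * x: the simplest
    online-learning baseline the abstract argues is too slow to track
    sub-second shifts. Returns the squared error at every step."""
    w = 0.0
    errors = []
    for x, y in stream:
        pred = w * x
        err = pred - y
        errors.append(err * err)
        w -= lr * err * x  # gradient step on the squared error
    return errors

errors = online_sgd(generate_stream(500))
# Errors spike at each regime boundary and then decay over many update
# steps, visualizing the adaptation lag that per-sample training incurs.
```

Each gradient step here costs a full forward/backward pass per sample; the paper's claim is that mapping the update to natural annealing in an electronic dynamical system removes that per-step training cost entirely.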
