Poster

Scaling Wearable Foundation Models

Girish Narayanswamy · Xin Liu · Kumar Ayush · Yuzhe Yang · Xuhai Xu · Shun Liao · Jake Garrison · Shyam Tailor · Jacob Sunshine · Yun Liu · Tim Althoff · Shrikanth Narayanan · Pushmeet Kohli · Jiening Zhan · Mark Malhotra · Shwetak Patel · Samy Abdel-Ghaffar · Daniel McDuff

Hall 3 + Hall 2B #18
[ Project Page ]
Sat 26 Apr midnight PDT — 2:30 a.m. PDT

Abstract:

Wearable sensors have become ubiquitous thanks to a variety of health tracking features. The resulting continuous and longitudinal measurements from everyday life generate large volumes of data; however, making sense of these observations for scientific and actionable insights is non-trivial. Inspired by the empirical success of generative modeling, where large neural networks learn powerful representations from vast amounts of text, image, video, or audio data, we investigate the scaling properties of wearable sensor foundation models across compute, data, and model size. Using a dataset of up to 40 million hours of in-situ heart rate, heart rate variability, electrodermal activity, accelerometer, skin temperature, and altimeter per-minute data from over 165,000 people, we create LSM, a multimodal foundation model built on the largest wearable-signals dataset with the most extensive range of sensor modalities to date. Our results establish the scaling laws of LSM for tasks such as imputation, interpolation and extrapolation across both time and sensor modalities. Moreover, we highlight how LSM enables sample-efficient downstream learning for tasks including exercise and activity recognition.
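The three generative evaluation tasks named above differ only in which portion of the sensor timeline is hidden from the model. A minimal sketch of how such masking schemes can be defined, using synthetic data and hypothetical mask parameters (this is an illustration of the task setup, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for per-minute wearable data: 60 minutes x 6 channels
# (e.g., heart rate, HRV, EDA, accelerometer, skin temperature, altimeter).
minutes, channels = 60, 6
signals = rng.normal(size=(minutes, channels))

def imputation_mask(shape, frac=0.3):
    """Randomly scattered mask: the model fills in missing samples."""
    return rng.random(shape) < frac

def interpolation_mask(shape, start=20, stop=40):
    """Contiguous temporal gap: the model reconstructs the middle."""
    m = np.zeros(shape, dtype=bool)
    m[start:stop, :] = True
    return m

def extrapolation_mask(shape, horizon=15):
    """Mask the final `horizon` minutes: the model forecasts the future."""
    m = np.zeros(shape, dtype=bool)
    m[-horizon:, :] = True
    return m

masks = {
    "imputation": imputation_mask(signals.shape),
    "interpolation": interpolation_mask(signals.shape),
    "extrapolation": extrapolation_mask(signals.shape),
}
for name, m in masks.items():
    visible = np.where(m, np.nan, signals)  # input the model would see
    print(name, "masked fraction:", round(float(m.mean()), 2))
```

Masking across sensor channels (rather than time steps) extends the same scheme to cross-modal imputation, where one signal is reconstructed from the others.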
