Scaling Wearable Foundation Models
DOI: 10.48550/arxiv.2410.13638
Publication Date: 2024-10-17
AUTHORS (18)
ABSTRACT
Wearable sensors have become ubiquitous thanks to a variety of health tracking features. The resulting continuous and longitudinal measurements from everyday life generate large volumes of data; however, making sense of these observations for scientific or actionable insights is non-trivial. Inspired by the empirical success of generative modeling, where neural networks learn powerful representations from vast amounts of text, image, video, or audio data, we investigate the scaling properties of sensor foundation models across compute, data, and model size. Using a dataset of up to 40 million hours of in-situ heart rate, heart rate variability, electrodermal activity, accelerometer, skin temperature, and altimeter per-minute data from over 165,000 people, we create LSM, a multimodal foundation model built on the largest wearable-signals dataset with the most extensive range of modalities to date. Our results establish the scaling laws of LSM for tasks such as imputation, interpolation, and extrapolation, both across time and modalities. Moreover, we highlight how LSM enables sample-efficient downstream learning for tasks like exercise and activity recognition.
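The scaling laws mentioned in the abstract typically take a power-law form, loss ≈ a · compute^(−b), which is a straight line in log-log space. As an illustrative sketch only (the constants and FLOP budgets below are invented for demonstration, not values from the paper), such a law can be recovered from loss measurements with a least-squares fit on the logarithms:

```python
import numpy as np

def fit_power_law(compute, loss):
    """Fit loss ~ a * compute**(-b) via linear regression in log-log space.

    Returns the estimated (a, b). This is a generic curve-fitting sketch,
    not the paper's actual fitting procedure.
    """
    slope, intercept = np.polyfit(np.log(compute), np.log(loss), deg=1)
    return float(np.exp(intercept)), float(-slope)

# Synthetic, noiseless data generated from a known law (a=2.0, b=0.25);
# the compute budgets are hypothetical FLOP counts.
compute = np.logspace(15, 21, num=7)
loss = 2.0 * compute ** -0.25

a, b = fit_power_law(compute, loss)
print(round(a, 3), round(b, 3))  # recovers a ≈ 2.0, b ≈ 0.25
```

With noisy real measurements, the same fit would return approximate rather than exact coefficients, and one would typically also report a goodness-of-fit measure.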