TY - JOUR
T1 - Multimodal Ambulatory Sleep Detection Using LSTM Recurrent Neural Networks
AU - Sano, Akane
AU - Chen, Weixuan
AU - Lopez-Martinez, Daniel
AU - Taylor, Sara
AU - Picard, Rosalind W.
N1 - Funding Information:
Manuscript received January 21, 2018; revised May 15, 2018 and July 18, 2018; accepted August 18, 2018. Date of publication September 14, 2018; date of current version July 1, 2019. This work was supported in part by NIH under Grant R01GM105018, in part by Samsung Electronics, in part by NEC, and in part by MIT Media Lab Consortium. (Akane Sano and Weixuan Chen contributed equally to this work.) (Corresponding author: Akane Sano.) A. Sano is with the Department of Electrical and Computer Engineering, Rice University, Houston, TX 77005 USA (e-mail: Akane.Sano@rice.edu).
Publisher Copyright:
© 2013 IEEE.
PY - 2019/7
Y1 - 2019/7
N2 - Unobtrusive and accurate ambulatory methods are needed to monitor long-term sleep patterns for improving health. Previously developed ambulatory sleep detection methods rely either in whole or in part on self-reported diary data as ground truth, which is a problem, since people often do not fill out diaries accurately. This paper presents an algorithm that uses multimodal data from smartphones and wearable technologies to detect sleep/wake state and sleep onset/offset using a type of recurrent neural network with long-short-term memory (LSTM) cells for synthesizing temporal information. We collected 5580 days of multimodal data from 186 participants and compared the new method for sleep/wake classification and sleep onset/offset detection to, first, nontemporal machine learning methods and, second, state-of-the-art actigraphy software. The new LSTM method achieved a sleep/wake classification accuracy of 96.5%, and sleep onset/offset detection F1 scores of 0.86 and 0.84, respectively, with mean absolute errors of 5.0 and 5.5 min, respectively, when compared with sleep/wake state and sleep onset/offset assessed using actigraphy and sleep diaries. The LSTM results were statistically superior to those from nontemporal machine learning algorithms and the actigraphy software. We show good generalization of the new algorithm by comparing participant-dependent and participant-independent models, and we show how to make the model nearly real time with slightly reduced performance.
AB - Unobtrusive and accurate ambulatory methods are needed to monitor long-term sleep patterns for improving health. Previously developed ambulatory sleep detection methods rely either in whole or in part on self-reported diary data as ground truth, which is a problem, since people often do not fill out diaries accurately. This paper presents an algorithm that uses multimodal data from smartphones and wearable technologies to detect sleep/wake state and sleep onset/offset using a type of recurrent neural network with long-short-term memory (LSTM) cells for synthesizing temporal information. We collected 5580 days of multimodal data from 186 participants and compared the new method for sleep/wake classification and sleep onset/offset detection to, first, nontemporal machine learning methods and, second, state-of-the-art actigraphy software. The new LSTM method achieved a sleep/wake classification accuracy of 96.5%, and sleep onset/offset detection F1 scores of 0.86 and 0.84, respectively, with mean absolute errors of 5.0 and 5.5 min, respectively, when compared with sleep/wake state and sleep onset/offset assessed using actigraphy and sleep diaries. The LSTM results were statistically superior to those from nontemporal machine learning algorithms and the actigraphy software. We show good generalization of the new algorithm by comparing participant-dependent and participant-independent models, and we show how to make the model nearly real time with slightly reduced performance.
KW - LSTM
KW - sleep monitoring
KW - long-short-term memory
KW - mobile health
KW - mobile phone
KW - recurrent neural networks
KW - sleep detection
KW - smartphone
KW - wearable sensor
UR - http://www.scopus.com/inward/record.url?scp=85052620184&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85052620184&partnerID=8YFLogxK
U2 - 10.1109/JBHI.2018.2867619
DO - 10.1109/JBHI.2018.2867619
M3 - Article
C2 - 30176613
AN - SCOPUS:85052620184
SN - 2168-2194
VL - 23
SP - 1607
EP - 1617
JO - IEEE Journal of Biomedical and Health Informatics
JF - IEEE Journal of Biomedical and Health Informatics
IS - 4
M1 - 8449917
ER -