Contrastive Pretraining for Stress Detection with Multimodal Wearable Sensor Data and Surveys

Zeyu Yang, Han Yu, Akane Sano

Research output: Contribution to journal › Conference article › peer-review

Abstract

Stress adversely affects mental and physical health, underscoring the importance of early detection. Prior studies have used physiological signals from wearable sensors, together with other information, to monitor stress levels in daily life. Because collecting stress labels is costly, recent work has turned to self-supervised methods. However, self-supervised learning that combines time series with tabular features such as demographics, traits, and contextual information remains understudied. There is therefore a need to investigate how a model can be trained effectively with multimodal data of different granularities and a limited number of labels. In this study, we introduce a self-supervised multimodal learning approach for stress detection that combines time series and tabular features. Our proposed method presents a promising solution for effectively monitoring stress using multimodal data.
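The abstract does not give implementation details, but the title suggests a cross-modal contrastive pretraining objective. As a hedged illustration only (not the authors' code), the core idea could be sketched as an InfoNCE-style loss that pulls together embeddings of the time-series and tabular views of the same sample while pushing apart mismatched pairs within a batch. All names, shapes, and hyperparameters below are assumptions for illustration.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    """Project embeddings onto the unit sphere for cosine similarity."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def info_nce_loss(z_ts, z_tab, temperature=0.1):
    """Symmetric InfoNCE loss aligning two modality embeddings.

    Row i of z_ts (time-series embedding) and z_tab (tabular embedding)
    is assumed to come from the same sample, forming a positive pair;
    every other row in the batch serves as an in-batch negative.
    """
    z_ts, z_tab = l2_normalize(z_ts), l2_normalize(z_tab)
    logits = z_ts @ z_tab.T / temperature  # (B, B) similarity matrix

    def xent_diag(lg):
        # cross-entropy with the positive pair on the diagonal
        lg = lg - lg.max(axis=1, keepdims=True)  # numerical stability
        logp = lg - np.log(np.exp(lg).sum(axis=1, keepdims=True))
        return -np.diag(logp).mean()

    # average the ts->tab and tab->ts directions
    return 0.5 * (xent_diag(logits) + xent_diag(logits.T))

# Toy check with 4 paired 8-dimensional embeddings: perfectly aligned
# pairs should incur a lower loss than randomly mismatched ones.
rng = np.random.default_rng(0)
z = rng.normal(size=(4, 8))
loss_aligned = info_nce_loss(z, z)
loss_random = info_nce_loss(z, rng.normal(size=(4, 8)))
```

In a full pipeline, `z_ts` would come from a sequence encoder over the wearable-sensor windows and `z_tab` from a feed-forward encoder over the survey/demographic features; after pretraining with this loss, the encoders could be fine-tuned on the limited stress labels.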

Original language: English (US)
Journal: Proceedings of Machine Learning Research
Volume: 287
State: Published - 2025
Event: 6th Conference on Health, Inference, and Learning, CHIL 2025 - Berkeley, United States
Duration: Jun 25, 2025 to Jun 27, 2025

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
