A multimodal stress detection dataset with facial expressions and physiological signals
Hosseini, Majid; Sohrab, Fahad; Gottumukkala, Raju; Bhupatiraju, Ravi Teja; Katragadda, Satya; Raitoharju, Jenni; Iosifidis, Alexandros; Gabbouj, Moncef (2025)
Scientific Data
1844
Persistent address of the publication:
https://urn.fi/URN:NBN:fi:tuni-2025121611772
Description
Peer reviewed
Abstract
Affective computing has garnered increasing attention from researchers in recent years, as AI systems need to better understand and react to human emotions. However, analyzing human emotional states such as mood or stress is complex. While various stress studies use facial expressions and wearables, most existing datasets rely on data from a single modality. This paper presents EmpathicSchool, a novel dataset that captures facial expressions together with the associated physiological signals, such as heart rate, electrodermal activity, and skin temperature, under different stress levels. The data were collected from 30 participants in sessions of about ninety minutes each, for a total of 40 hours. The dataset includes seven signal types, spanning both computer-vision and physiological features that can be used to detect stress. In addition, various experiments were conducted to validate the signal quality.
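
The abstract describes combining facial-expression and physiological features for stress detection. The sketch below is purely illustrative: it assumes a hypothetical early-fusion setup with synthetic feature windows, and the feature counts, variable names, and classifier choice are assumptions that do not reflect the published EmpathicSchool data format or the authors' validation experiments.

# Illustrative sketch (hypothetical layout, not the EmpathicSchool release):
# fuse per-window physiological features (heart rate, electrodermal activity,
# skin temperature) with facial-expression features to train a stress classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: 3 physiological + 4 facial-expression features per window.
n_windows = 500
physio = rng.normal(size=(n_windows, 3))      # e.g. heart rate, EDA, skin temperature
facial = rng.normal(size=(n_windows, 4))      # e.g. facial action-unit intensities
labels = rng.integers(0, 2, size=n_windows)   # 0 = baseline, 1 = stressed (assumed labels)

# Early fusion: concatenate the two modalities into one feature vector per window.
features = np.concatenate([physio, facial], axis=1)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))

With real multimodal recordings, the same early-fusion pattern applies once the facial and physiological streams have been windowed and time-aligned; late fusion (separate per-modality classifiers whose outputs are combined) is a common alternative.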
Collections
- TUNICRIS-julkaisut [24199]
