Abstract:
Emotions play a significant role in human-computer interaction and in entertainment consumption, which is common among young adults. The main challenge is the lack of a publicly available dataset of physiological signals with emotion labels for young adults. This article presents a multi-modal dataset of Electrocardiogram (ECG) and Galvanic Skin Response (GSR) signals for emotion classification of young adults. Signals were acquired through Shimmer3 ECG and Shimmer3 GSR units worn on the chest and palm of the participants. ECG signals were acquired from 25 participants and GSR signals from 12 participants while they watched 21 emotional stimulus videos divided into three sessions. The data was self-annotated for seven emotions: happy, sad, fear, surprise, anger, disgust, and neutral. These emotional states were further self-annotated with five intensity levels of felt emotion: very low, low, moderate, high, and very high. Participants also reported valence, arousal, and dominance scores through a Google Form for each stimulus. Baseline experimental results on the ECG data for classifying four classes, high valence high arousal (HVHA), high valence low arousal (HVLA), low valence high arousal (LVHA), and low valence low arousal (LVLA), are reported with an accuracy of 69.66%. Our baseline method for the proposed dataset achieved 66.64% accuracy for the eight-class classification of categorical emotions. The significance of the dataset lies in its larger number of emotion classes and its less intrusive sensors, which mimic real-world applications.
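The four valence-arousal classes used in the baseline can be derived from the self-reported scores by thresholding each axis. A minimal sketch in Python, assuming a 1-9 rating scale with the midpoint as the threshold (the abstract does not state the scale or threshold actually used, so both are assumptions):

```python
def va_quadrant(valence: float, arousal: float, midpoint: float = 5.0) -> str:
    """Map a self-reported (valence, arousal) pair to one of the four
    quadrant labels: HVHA, HVLA, LVHA, or LVLA.

    The 1-9 scale and midpoint threshold are illustrative assumptions.
    """
    v = "HV" if valence >= midpoint else "LV"
    a = "HA" if arousal >= midpoint else "LA"
    return v + a

# Example: a high-valence, low-arousal rating (e.g., a calm, pleasant clip)
print(va_quadrant(7, 3))  # HVLA
```

This reduces the seven categorical emotions and their intensity ratings to the coarser four-class problem reported in the baseline results.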
Page(s):
1-1
DOI:
Not available
Published in:
IEEE International Conference on Digital Futures and Transformative Technologies (ICoDT2), May 24-26, 2022 (Book of Abstracts), Volume: 1, Issue: 1, Year: 2022