Class-based analysis of Russell’s four-quadrant emotion prediction in virtual reality using multi-layer feedforward ANNs

The following research describes the potential of classifying four emotion classes using a wearable EEG headset, with VR used to induce emotional responses from the users. Various researchers have conducted emotion recognition using medical-grade EEG devices supported with a 2D monitor screen to indu...

Bibliographic Details
Main Authors: Nazmi Sofian Suhaimi, James Mountstephens, Jason Teo
Format: Proceedings
Language:en
Published: Association for Computing Machinery 2021
Subjects:
Online Access:https://eprints.ums.edu.my/id/eprint/44934/1/FULLTEXT.pdf
https://eprints.ums.edu.my/id/eprint/44934/
https://dl.acm.org/doi/10.1145/3457784.3457809
_version_ 1840842974234148864
author Nazmi Sofian Suhaimi
James Mountstephens
Jason Teo
author_facet Nazmi Sofian Suhaimi
James Mountstephens
Jason Teo
author_sort Nazmi Sofian Suhaimi
building UMS Library
collection Institutional Repository
content_provider Universiti Malaysia Sabah
content_source UMS Institutional Repository
continent Asia
country Malaysia
description This research describes the potential of classifying four emotion classes using a wearable EEG headset, with virtual reality (VR) used to induce emotional responses from users. Various researchers have conducted emotion recognition using medical-grade EEG devices paired with a 2D monitor screen to induce emotional responses. This approach can introduce additional artifacts because the user's attention strays beyond the borders of the monitor displaying the intended stimulus, reducing classification accuracy. The large, complex EEG machines used in medical settings are sensitive equipment that must be operated by trained professionals, making it difficult to obtain access to such devices. Hence, a small, portable wearable EEG headset was chosen for brainwave signal sampling, which suits researchers conducting experiments for a human emotion recognition system. The wearable EEG headset collects brainwave signals at the TP9, TP10, AF7, and AF8 electrode placements, sampled at 256 Hz across five frequency bands (delta, theta, alpha, beta, gamma). The wearable EEG headset was combined with a VR headset to induce emotional responses using prepared VR video stimuli. The VR videos were arranged according to the Arousal Valence Space (AVS) model, with four videos covering the respective quadrants, each presented for 80 seconds with a 10-second rest interval during transitions, totaling 360 seconds from beginning to end. The collected samples were classified using a feedforward artificial neural network (FANN) with 10-fold cross-validation; the model was trained on 90% of the dataset, with 10% used for validation. The highest average classification accuracy obtained from the FANN was 41.04%.
While overall classification performance was low, the confusion matrices offered a different view of the four classes at different training epoch counts. Observations at 2000, 3000, and 5000 training epochs showed that the emotion classes happy, scared, bored, and calm achieved per-class classification accuracies of 75.15%, 75.12%, 75.02%, and 74.24%, respectively.
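The evaluation protocol in the abstract (a feedforward ANN over four emotion classes, assessed with 10-fold cross-validation) can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual configuration: the hidden-layer size, iteration count, and the synthetic stand-in for band-power features (4 electrodes × 5 bands = 20 features) are assumptions.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for band-power features: 4 electrodes x 5 bands = 20 features.
# The real study would use features extracted from TP9/TP10/AF7/AF8 EEG signals.
X = rng.normal(size=(400, 20))
y = rng.integers(0, 4, size=400)  # four classes: happy, scared, bored, calm

# 10-fold cross-validation, as described in the abstract.
skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
accs = []
for train_idx, test_idx in skf.split(X, y):
    # A small feedforward network; architecture here is purely illustrative.
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    accs.append(clf.score(X[test_idx], y[test_idx]))

print(f"mean 10-fold accuracy: {np.mean(accs):.2%}")
```

With random features, the mean accuracy hovers near the 25% chance level for four classes; the point of the sketch is the fold structure and averaging, not the score.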
format Proceedings
id my.ums.eprints-44934
institution Universiti Malaysia Sabah
language en
publishDate 2021
publisher Association for Computing Machinery
record_format eprints
spelling my.ums.eprints-449342025-08-13T05:37:14Z https://eprints.ums.edu.my/id/eprint/44934/ Class-based analysis of Russell’s four-quadrant emotion prediction in virtual reality using multi-layer feedforward ANNs Nazmi Sofian Suhaimi James Mountstephens Jason Teo BF511-593 Affection. Feeling. Emotion QA71-90 Instruments and machines This research describes the potential of classifying four emotion classes using a wearable EEG headset, with virtual reality (VR) used to induce emotional responses from users. Various researchers have conducted emotion recognition using medical-grade EEG devices paired with a 2D monitor screen to induce emotional responses. This approach can introduce additional artifacts because the user's attention strays beyond the borders of the monitor displaying the intended stimulus, reducing classification accuracy. The large, complex EEG machines used in medical settings are sensitive equipment that must be operated by trained professionals, making it difficult to obtain access to such devices. Hence, a small, portable wearable EEG headset was chosen for brainwave signal sampling, which suits researchers conducting experiments for a human emotion recognition system. The wearable EEG headset collects brainwave signals at the TP9, TP10, AF7, and AF8 electrode placements, sampled at 256 Hz across five frequency bands (delta, theta, alpha, beta, gamma). The wearable EEG headset was combined with a VR headset to induce emotional responses using prepared VR video stimuli. The VR videos were arranged according to the Arousal Valence Space (AVS) model, with four videos covering the respective quadrants, each presented for 80 seconds with a 10-second rest interval during transitions, totaling 360 seconds from beginning to end.
The collected samples were classified using a feedforward artificial neural network (FANN) with 10-fold cross-validation; the model was trained on 90% of the dataset, with 10% used for validation. The highest average classification accuracy obtained from the FANN was 41.04%. While overall classification performance was low, the confusion matrices offered a different view of the four classes at different training epoch counts. Observations at 2000, 3000, and 5000 training epochs showed that the emotion classes happy, scared, bored, and calm achieved per-class classification accuracies of 75.15%, 75.12%, 75.02%, and 74.24%, respectively. Association for Computing Machinery 2021-07-30 Proceedings PeerReviewed text en https://eprints.ums.edu.my/id/eprint/44934/1/FULLTEXT.pdf Nazmi Sofian Suhaimi and James Mountstephens and Jason Teo (2021) Class-based analysis of Russell’s four-quadrant emotion prediction in virtual reality using multi-layer feedforward ANNs. https://dl.acm.org/doi/10.1145/3457784.3457809
spellingShingle BF511-593 Affection. Feeling. Emotion
QA71-90 Instruments and machines
Nazmi Sofian Suhaimi
James Mountstephens
Jason Teo
Class-based analysis of Russell’s four-quadrant emotion prediction in virtual reality using multi-layer feedforward ANNs
title Class-based analysis of Russell’s four-quadrant emotion prediction in virtual reality using multi-layer feedforward ANNs
title_full Class-based analysis of Russell’s four-quadrant emotion prediction in virtual reality using multi-layer feedforward ANNs
title_fullStr Class-based analysis of Russell’s four-quadrant emotion prediction in virtual reality using multi-layer feedforward ANNs
title_full_unstemmed Class-based analysis of Russell’s four-quadrant emotion prediction in virtual reality using multi-layer feedforward ANNs
title_short Class-based analysis of Russell’s four-quadrant emotion prediction in virtual reality using multi-layer feedforward ANNs
title_sort class-based analysis of russell’s four-quadrant emotion prediction in virtual reality using multi-layer feedforward anns
topic BF511-593 Affection. Feeling. Emotion
QA71-90 Instruments and machines
url https://eprints.ums.edu.my/id/eprint/44934/1/FULLTEXT.pdf
https://eprints.ums.edu.my/id/eprint/44934/
https://dl.acm.org/doi/10.1145/3457784.3457809
url_provider http://eprints.ums.edu.my/