Multiclass emotion classification using pupil size in VR: Tuning support vector machines to improve performance

Bibliographic Details
Main Authors: Lim Jia Zheng, James Mountstephens, Jason Teo Tze Wi
Format: Conference or Workshop Item
Language:en
Published: 2020
Subjects:
Online Access:https://eprints.ums.edu.my/id/eprint/45160/1/FULLTEXT.pdf
https://eprints.ums.edu.my/id/eprint/45160/
https://iopscience.iop.org/article/10.1088/1742-6596/1529/5/052062/meta
Description
Summary: Emotion recognition and classification have become popular research topics in computer science. In this paper, we present an emotion classification approach using eye-tracking data alone with machine learning in Virtual Reality (VR). Emotions were classified into four distinct classes according to the Circumplex Model of Affect. The emotional stimuli used in this experiment were 360° videos presented in VR across four stimulation sessions, one for each quadrant of emotions. Eye-tracking data were recorded using an eye-tracker, and pupil diameter was chosen as the single-modality feature for this investigation. The classifier used in this experiment was the Support Vector Machine (SVM). By tuning the SVM's parameters, a best accuracy of 57.65% was achieved.
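The tuning procedure the abstract describes can be sketched as a cross-validated grid search over SVM hyperparameters. This is a minimal illustration only: the synthetic pupil-diameter features, the four-feature window statistics, and the specific parameter grid are assumptions, not the authors' actual dataset or setup.

```python
# Hypothetical sketch of multiclass SVM tuning on pupil-diameter features.
# The synthetic data below stands in for the paper's eye-tracking recordings.
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_per_class = 50
# Four emotion quadrants; each sample is four pupil-diameter statistics (mm).
X = np.vstack([rng.normal(loc=m, scale=0.4, size=(n_per_class, 4))
               for m in (2.5, 3.0, 3.5, 4.0)])
y = np.repeat(np.arange(4), n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Tune the regularization strength C and RBF kernel width gamma with 5-fold
# cross-validation; scikit-learn's SVC handles multiclass one-vs-one natively.
grid = GridSearchCV(
    make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    param_grid={"svc__C": [0.1, 1, 10, 100],
                "svc__gamma": ["scale", 0.01, 0.1, 1]},
    cv=5)
grid.fit(X_train, y_train)
acc = grid.score(X_test, y_test)
print(f"best params: {grid.best_params_}, test accuracy: {acc:.2%}")
```

On the paper's real recordings, the features would instead be statistics of the measured pupil diameter per stimulation session, and the grid would be chosen to match the parameters the authors actually tuned.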