Affective Recognition in Dynamic and Interactive Virtual Environments

Research output: Contribution to journal › Article

Abstract

The past decade has witnessed a significant increase in interest in the role of human emotional behaviour in interactive multimodal computing. Although much consideration has been given to non-interactive affective stimuli (e.g. images and videos), the recognition of emotions within interactive virtual environments has not received an equal level of attention. In the present study, a psychophysiological database, cataloguing the EEG, GSR and heart rate of 30 participants exposed to an affective virtual environment, has been constructed. 743 features were extracted from the physiological signals; a feature selection technique then reduced the dimensionality of the feature space to a subset of 30 features. Four classification techniques (KNN, SVM, Discriminant Analysis (DA) and Classification Tree) were employed to classify the affective psychophysiological database into four Affective Clusters (derived from a Valence-Arousal space) and eight Emotion Labels. By employing cross-validation techniques, the performances of more than a quarter of a million different classification settings (various window lengths, classifier settings, etc.) were investigated. The results suggested that the physiological signals could be employed to classify emotional experiences with high precision. KNN and SVM outperformed both the Classification Tree and DA classifiers, with mean accuracies of 97.01% and 92.84%, respectively.
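The best-performing classifier reported above, KNN with cross-validated evaluation, can be illustrated with a minimal sketch. This is not the authors' implementation: the distance metric, the value of k, the fold count, and the synthetic feature vectors below are all assumptions for illustration; the actual 30 selected physiological features and the feature-selection step are omitted.

```python
import random
from collections import Counter
from math import dist  # Euclidean distance, Python 3.8+

def knn_predict(train_X, train_y, x, k=5):
    # Label x by majority vote among its k nearest training samples
    # (Euclidean distance assumed; the paper's metric is not specified here).
    neighbours = sorted(zip(train_X, train_y), key=lambda p: dist(p[0], x))[:k]
    return Counter(label for _, label in neighbours).most_common(1)[0][0]

def cross_validate(X, y, k_folds=10, k=5):
    # Plain k-fold cross-validation returning the mean fold accuracy.
    idx = list(range(len(X)))
    random.Random(0).shuffle(idx)
    folds = [idx[i::k_folds] for i in range(k_folds)]
    accs = []
    for fold in folds:
        held_out = set(fold)
        tr_X = [X[i] for i in idx if i not in held_out]
        tr_y = [y[i] for i in idx if i not in held_out]
        correct = sum(knn_predict(tr_X, tr_y, X[i], k) == y[i] for i in fold)
        accs.append(correct / len(fold))
    return sum(accs) / len(accs)
```

In practice each sample would be a 30-dimensional vector of selected EEG/GSR/heart-rate features and each label one of the four Affective Clusters or eight Emotion Labels; the sketch works for any such list of feature vectors and labels.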

Details

Original language: English
Journal: IEEE Transactions on Affective Computing
Early online date: 23 Oct 2017
Publication status: E-pub ahead of print - 23 Oct 2017

Keywords

  • Virtual Reality, Affective Computing, Emotion-Based Affective Physiological Database, Affective VR
