Multimodal affect recognition in an interactive gaming environment using eye tracking and speech signals
Research output: Chapter in Book/Report/Conference proceeding › Conference contribution
External organisations
- De Montfort University
Abstract
This paper presents a multimodal affect recognition system for interactive virtual gaming environments using eye tracking and speech signals, captured in gameplay scenarios that are designed to elicit controlled affective states based on the arousal and valence dimensions. The Support Vector Machine is employed as a classifier to detect these affective states from both modalities. The recognition results reveal that eye tracking is superior to speech in affect detection and that the two modalities are complementary in the application of interactive gaming. This suggests that it is feasible to design an accurate multimodal recognition system to detect players' affects from the eye tracking and speech modalities in the interactive gaming environment. We emphasise the potential of integrating the proposed multimodal system into game interfaces to enhance interaction and provide an adaptive gaming experience.
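To make the described pipeline concrete, the sketch below shows the general shape of an SVM-based affect classifier over two modalities with decision-level fusion. It is not the authors' code: the feature dimensions, synthetic data, scikit-learn pipeline, and probability-averaging fusion are all illustrative assumptions standing in for the paper's actual features and fusion scheme.

```python
# Minimal sketch (not the authors' implementation): per-modality SVM affect
# classification plus simple decision-level fusion, assuming scikit-learn and
# synthetic placeholder features for eye tracking and speech.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder feature matrices: rows are gameplay segments, columns are
# hypothetical features (e.g. fixation/pupil statistics for eye tracking,
# prosodic statistics for speech). Labels are arousal-valence quadrants (0-3).
n_samples = 200
X_eye = rng.normal(size=(n_samples, 12))
X_speech = rng.normal(size=(n_samples, 20))
y = rng.integers(0, 4, size=n_samples)

idx_train, idx_test = train_test_split(
    np.arange(n_samples), test_size=0.3, random_state=0, stratify=y
)

def train_svm(X, y, idx_train):
    """Fit a standardised RBF-kernel SVM on one modality."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    clf.fit(X[idx_train], y[idx_train])
    return clf

clf_eye = train_svm(X_eye, y, idx_train)
clf_speech = train_svm(X_speech, y, idx_train)

# Unimodal accuracy on the held-out segments.
for name, clf, X in [("eye tracking", clf_eye, X_eye), ("speech", clf_speech, X_speech)]:
    print(f"{name}: {accuracy_score(y[idx_test], clf.predict(X[idx_test])):.3f}")

# Decision-level fusion: average the class-probability estimates of the two
# unimodal SVMs and pick the most probable affective state.
proba = (clf_eye.predict_proba(X_eye[idx_test])
         + clf_speech.predict_proba(X_speech[idx_test])) / 2
y_fused = clf_eye.classes_[np.argmax(proba, axis=1)]
print(f"fused: {accuracy_score(y[idx_test], y_fused):.3f}")
```

With real features, the per-modality scores correspond to the paper's unimodal comparison (eye tracking vs. speech), and the fused score is one straightforward way to exploit the complementarity of the two modalities; other fusion strategies (feature-level concatenation, weighted voting) are equally plausible.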
Details
Original language | English |
---|---|
Title of host publication | ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction |
Publication status | Published - 3 Nov 2017 |
Event | 19th ACM International Conference on Multimodal Interaction, ICMI 2017 - Glasgow, United Kingdom. Duration: 13 Nov 2017 → 17 Nov 2017 |
Conference
Conference | 19th ACM International Conference on Multimodal Interaction, ICMI 2017 |
---|---|
Country | United Kingdom |
City | Glasgow |
Period | 13/11/17 → 17/11/17 |
Keywords
- Affects, Eye tracking, Games, Speech