Abstract
This paper presents a multimodal affect recognition system for interactive virtual gaming environments that uses eye tracking and speech signals captured in gameplay scenarios designed to elicit controlled affective states along the arousal and valence dimensions. A support vector machine (SVM) is employed as the classifier to detect these affective states from both modalities. The recognition results reveal that eye tracking outperforms speech in affect detection and that the two modalities are complementary in interactive gaming applications. This suggests that it is feasible to design an accurate multimodal recognition system that detects players' affective states from the eye tracking and speech modalities in an interactive gaming environment. We emphasise the potential of integrating the proposed multimodal system into game interfaces to enhance interaction and provide an adaptive gaming experience.
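The paper itself does not publish implementation details here, but the pipeline the abstract describes (per-modality SVM classifiers with the modalities combined for a final decision) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the feature dimensions, the synthetic data, the four-quadrant arousal/valence labels, the RBF kernel, and the probability-averaging late fusion are not taken from the paper; the sketch uses scikit-learn's `SVC`.

```python
# Hypothetical sketch of the kind of pipeline the abstract describes:
# one SVM per modality (eye tracking, speech), fused at decision level.
# Features, labels, and the fusion rule are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder features: 10-D eye-tracking vectors (e.g. pupil diameter,
# fixation statistics) and 20-D speech vectors (e.g. prosodic features).
n_samples = 200
X_eye = rng.normal(size=(n_samples, 10))
X_speech = rng.normal(size=(n_samples, 20))
# Labels 0..3: the four quadrants of the arousal-valence plane.
y = rng.integers(0, 4, size=n_samples)

idx_train, idx_test = train_test_split(np.arange(n_samples), random_state=0)

# One SVM per modality; probability=True enables soft decision-level fusion.
svm_eye = SVC(kernel="rbf", probability=True).fit(X_eye[idx_train], y[idx_train])
svm_speech = SVC(kernel="rbf", probability=True).fit(X_speech[idx_train], y[idx_train])

# Late fusion: average the per-class posteriors of the two modalities,
# then pick the most probable affective state.
p_eye = svm_eye.predict_proba(X_eye[idx_test])
p_speech = svm_speech.predict_proba(X_speech[idx_test])
y_pred = (0.5 * p_eye + 0.5 * p_speech).argmax(axis=1)

accuracy = (y_pred == y[idx_test]).mean()
print(f"fused accuracy on synthetic data: {accuracy:.2f}")
```

The equal 0.5/0.5 weighting is a neutral default; since the abstract reports eye tracking outperforming speech, a real system might weight the eye-tracking posterior more heavily or learn the fusion weights on a validation set.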
| Original language | English |
| --- | --- |
| Title of host publication | ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction |
| Place of Publication | New York |
| Publisher | Association for Computing Machinery |
| Pages | 479-486 |
| Number of pages | 8 |
| Volume | 2017-January |
| ISBN (Electronic) | 9781450355438 |
| DOIs | |
| Publication status | Published - 3 Nov 2017 |
| Event | 19th ACM International Conference on Multimodal Interaction, ICMI 2017 - Glasgow, United Kingdom. Duration: 13 Nov 2017 → 17 Nov 2017 |
Conference
| Conference | 19th ACM International Conference on Multimodal Interaction, ICMI 2017 |
| --- | --- |
| Country/Territory | United Kingdom |
| City | Glasgow |
| Period | 13/11/17 → 17/11/17 |
Keywords
- Affects
- Eye tracking
- Games
- Speech
ASJC Scopus subject areas
- Human-Computer Interaction
- Computer Science Applications
- Computer Vision and Pattern Recognition
- Hardware and Architecture