Multimodal affect recognition in an interactive gaming environment using eye tracking and speech signals

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Standard

Multimodal affect recognition in an interactive gaming environment using eye tracking and speech signals. / Alhargan, Ashwaq; Cooke, Neil; Binjammaz, Tareq.

ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction. Vol. 2017-January. New York: Association for Computing Machinery, 2017. p. 479-486.
Harvard

Alhargan, A, Cooke, N & Binjammaz, T 2017, Multimodal affect recognition in an interactive gaming environment using eye tracking and speech signals. in ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction. vol. 2017-January, Association for Computing Machinery, New York, pp. 479-486, 19th ACM International Conference on Multimodal Interaction, ICMI 2017, Glasgow, United Kingdom, 13/11/17. https://doi.org/10.1145/3136755.3137016

APA

Alhargan, A., Cooke, N., & Binjammaz, T. (2017). Multimodal affect recognition in an interactive gaming environment using eye tracking and speech signals. In ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction (Vol. 2017-January, pp. 479-486). Association for Computing Machinery. https://doi.org/10.1145/3136755.3137016

Vancouver

Alhargan A, Cooke N, Binjammaz T. Multimodal affect recognition in an interactive gaming environment using eye tracking and speech signals. In ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction. Vol. 2017-January. New York: Association for Computing Machinery; 2017. p. 479-486. https://doi.org/10.1145/3136755.3137016

Author

Alhargan, Ashwaq ; Cooke, Neil ; Binjammaz, Tareq. / Multimodal affect recognition in an interactive gaming environment using eye tracking and speech signals. ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction. Vol. 2017-January. New York: Association for Computing Machinery, 2017. pp. 479-486

Bibtex

@inproceedings{2637b5e57eb34923a1b5acada120ccf5,
title = "Multimodal affect recognition in an interactive gaming environment using eye tracking and speech signals",
abstract = "This paper presents a multimodal affect recognition system for interactive virtual gaming environments using eye tracking and speech signals, captured in gameplay scenarios that are designed to elicit controlled affective states based on the arousal and valence dimensions. The Support Vector Machine is employed as a classifier to detect these affective states from both modalities. The recognition results reveal that eye tracking is superior to speech in affect detection and that the two modalities are complementary in the application of interactive gaming. This suggests that it is feasible to design an accurate multimodal recognition system to detect players' affects from the eye tracking and speech modalities in the interactive gaming environment. We emphasise the potential of integrating the proposed multimodal system into game interfaces to enhance interaction and provide an adaptive gaming experience.",
keywords = "Affects, Eye tracking, Games, Speech",
author = "Ashwaq Alhargan and Neil Cooke and Tareq Binjammaz",
year = "2017",
month = nov,
day = "3",
doi = "10.1145/3136755.3137016",
language = "English",
volume = "2017-January",
pages = "479--486",
booktitle = "ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction",
publisher = "Association for Computing Machinery",
note = "19th ACM International Conference on Multimodal Interaction, ICMI 2017 ; Conference date: 13-11-2017 Through 17-11-2017",
}

RIS

TY - GEN

T1 - Multimodal affect recognition in an interactive gaming environment using eye tracking and speech signals

AU - Alhargan, Ashwaq

AU - Cooke, Neil

AU - Binjammaz, Tareq

PY - 2017/11/3

Y1 - 2017/11/3

N2 - This paper presents a multimodal affect recognition system for interactive virtual gaming environments using eye tracking and speech signals, captured in gameplay scenarios that are designed to elicit controlled affective states based on the arousal and valence dimensions. The Support Vector Machine is employed as a classifier to detect these affective states from both modalities. The recognition results reveal that eye tracking is superior to speech in affect detection and that the two modalities are complementary in the application of interactive gaming. This suggests that it is feasible to design an accurate multimodal recognition system to detect players' affects from the eye tracking and speech modalities in the interactive gaming environment. We emphasise the potential of integrating the proposed multimodal system into game interfaces to enhance interaction and provide an adaptive gaming experience.

AB - This paper presents a multimodal affect recognition system for interactive virtual gaming environments using eye tracking and speech signals, captured in gameplay scenarios that are designed to elicit controlled affective states based on the arousal and valence dimensions. The Support Vector Machine is employed as a classifier to detect these affective states from both modalities. The recognition results reveal that eye tracking is superior to speech in affect detection and that the two modalities are complementary in the application of interactive gaming. This suggests that it is feasible to design an accurate multimodal recognition system to detect players' affects from the eye tracking and speech modalities in the interactive gaming environment. We emphasise the potential of integrating the proposed multimodal system into game interfaces to enhance interaction and provide an adaptive gaming experience.

KW - Affects

KW - Eye tracking

KW - Games

KW - Speech

UR - http://www.scopus.com/inward/record.url?scp=85046760293&partnerID=8YFLogxK

U2 - 10.1145/3136755.3137016

DO - 10.1145/3136755.3137016

M3 - Conference contribution

AN - SCOPUS:85046760293

VL - 2017-January

SP - 479

EP - 486

BT - ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction

PB - Association for Computing Machinery

CY - New York

T2 - 19th ACM International Conference on Multimodal Interaction, ICMI 2017

Y2 - 13 November 2017 through 17 November 2017

ER -