Affect recognition in an interactive gaming environment using eye tracking

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Standard

Affect recognition in an interactive gaming environment using eye tracking. / Alhargan, Ashwaq; Cooke, Neil; Binjammaz, Tareq.

2017 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017. Vol. 2018-January. Institute of Electrical and Electronics Engineers (IEEE), 2018. p. 285-291.

Harvard

Alhargan, A, Cooke, N & Binjammaz, T 2018, Affect recognition in an interactive gaming environment using eye tracking. in 2017 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017. vol. 2018-January, Institute of Electrical and Electronics Engineers (IEEE), pp. 285-291, 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017, San Antonio, United States, 23/10/17. https://doi.org/10.1109/ACII.2017.8273614

APA

Alhargan, A., Cooke, N., & Binjammaz, T. (2018). Affect recognition in an interactive gaming environment using eye tracking. In 2017 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017 (Vol. 2018-January, pp. 285-291). Institute of Electrical and Electronics Engineers (IEEE). https://doi.org/10.1109/ACII.2017.8273614

Vancouver

Alhargan A, Cooke N, Binjammaz T. Affect recognition in an interactive gaming environment using eye tracking. In 2017 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017. Vol. 2018-January. Institute of Electrical and Electronics Engineers (IEEE). 2018. p. 285-291. https://doi.org/10.1109/ACII.2017.8273614

Author

Alhargan, Ashwaq ; Cooke, Neil ; Binjammaz, Tareq. / Affect recognition in an interactive gaming environment using eye tracking. 2017 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017. Vol. 2018-January. Institute of Electrical and Electronics Engineers (IEEE), 2018. pp. 285-291

Bibtex

@inproceedings{8551b004660d4372ac4e5d7eedded1c1,
title = "Affect recognition in an interactive gaming environment using eye tracking",
abstract = "Eye tracking is more accurate and less intrusive method to detect the affective state than other current physiological sensing methods. This paper aims to investigate the degree to which this affective state is automatically recognised from eye tracking signals, namely pupillary responses. We tracked fourteen players while they interacted with a virtual gaming environment, designed to elicit different affective states corresponding to the arousal and valence dimensions of Russell's circumplex model. Pupillary response features with luminosity compensation is used for automatic affect recognition. Moreover, pupillary response features based on the Hilbert transform are proposed to improve the recognition performance. The results demonstrated that the recognition of a player's affective state on both dimensions were successful with pupillary response features and using the Hilbert transform improve the recognition performance up to 76.0% and 61.4% on arousal and valence, respectively. This highlights the potential of affect-aware gaming interfaces based on pupillary responses sensing.",
author = "Ashwaq Alhargan and Neil Cooke and Tareq Binjammaz",
year = "2018",
month = feb,
day = "1",
doi = "10.1109/ACII.2017.8273614",
language = "English",
volume = "2018-January",
pages = "285--291",
booktitle = "2017 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017",
publisher = "Institute of Electrical and Electronics Engineers (IEEE)",
note = "7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017 ; Conference date: 23-10-2017 Through 26-10-2017",

}

RIS

TY - GEN

T1 - Affect recognition in an interactive gaming environment using eye tracking

AU - Alhargan, Ashwaq

AU - Cooke, Neil

AU - Binjammaz, Tareq

PY - 2018/2/1

Y1 - 2018/2/1

N2 - Eye tracking is a more accurate and less intrusive method for detecting affective state than other current physiological sensing methods. This paper investigates the degree to which affective state can be automatically recognised from eye tracking signals, namely pupillary responses. We tracked fourteen players while they interacted with a virtual gaming environment designed to elicit different affective states corresponding to the arousal and valence dimensions of Russell's circumplex model. Pupillary response features with luminosity compensation are used for automatic affect recognition. Moreover, pupillary response features based on the Hilbert transform are proposed to improve recognition performance. The results demonstrate that a player's affective state can be successfully recognised on both dimensions from pupillary response features, and that using the Hilbert transform improves recognition performance to up to 76.0% and 61.4% on arousal and valence, respectively. This highlights the potential of affect-aware gaming interfaces based on pupillary response sensing.

AB - Eye tracking is a more accurate and less intrusive method for detecting affective state than other current physiological sensing methods. This paper investigates the degree to which affective state can be automatically recognised from eye tracking signals, namely pupillary responses. We tracked fourteen players while they interacted with a virtual gaming environment designed to elicit different affective states corresponding to the arousal and valence dimensions of Russell's circumplex model. Pupillary response features with luminosity compensation are used for automatic affect recognition. Moreover, pupillary response features based on the Hilbert transform are proposed to improve recognition performance. The results demonstrate that a player's affective state can be successfully recognised on both dimensions from pupillary response features, and that using the Hilbert transform improves recognition performance to up to 76.0% and 61.4% on arousal and valence, respectively. This highlights the potential of affect-aware gaming interfaces based on pupillary response sensing.

UR - http://www.scopus.com/inward/record.url?scp=85046752280&partnerID=8YFLogxK

U2 - 10.1109/ACII.2017.8273614

DO - 10.1109/ACII.2017.8273614

M3 - Conference contribution

AN - SCOPUS:85046752280

VL - 2018-January

SP - 285

EP - 291

BT - 2017 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017

PB - Institute of Electrical and Electronics Engineers (IEEE)

T2 - 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017

Y2 - 23 October 2017 through 26 October 2017

ER -
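
Note on the abstract: the paper describes pupillary response features with luminosity compensation and Hilbert-transform-based features for affect recognition. As an illustration only (a minimal Python sketch of how such features might be computed, not the authors' implementation), the example below derives instantaneous-amplitude and instantaneous-frequency statistics from a luminance-compensated pupil-diameter signal using scipy.signal.hilbert. The regression-based compensation, the 60 Hz sampling rate, and the specific feature set are assumptions made for demonstration.

    # Illustrative sketch of Hilbert-transform features from a pupil-diameter signal.
    # Assumptions (not from the paper): least-squares regression against screen
    # luminance for luminosity compensation, and simple amplitude/frequency statistics.
    import numpy as np
    from scipy.signal import hilbert

    def luminosity_compensate(pupil, luminance):
        """Remove the component of pupil diameter explained by screen luminance
        via least-squares regression (one possible compensation scheme)."""
        A = np.column_stack([luminance, np.ones_like(luminance)])
        coeffs, *_ = np.linalg.lstsq(A, pupil, rcond=None)
        return pupil - A @ coeffs  # residual = luminance-compensated signal

    def hilbert_features(signal, fs):
        """Instantaneous-amplitude and -frequency statistics from the analytic signal."""
        analytic = hilbert(signal - np.mean(signal))
        amplitude = np.abs(analytic)                   # instantaneous amplitude (envelope)
        phase = np.unwrap(np.angle(analytic))          # instantaneous phase
        inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency in Hz
        return {
            "amp_mean": amplitude.mean(),
            "amp_std": amplitude.std(),
            "freq_mean": inst_freq.mean(),
            "freq_std": inst_freq.std(),
        }

    # Example with synthetic data at an assumed 60 Hz eye-tracker sampling rate.
    fs = 60
    t = np.arange(0, 10, 1 / fs)
    luminance = 0.5 + 0.5 * np.sin(2 * np.pi * 0.1 * t)
    pupil = 3.0 - 0.8 * luminance + 0.05 * np.sin(2 * np.pi * 1.2 * t)
    compensated = luminosity_compensate(pupil, luminance)
    print(hilbert_features(compensated, fs))

In a setup like the one the abstract describes, such per-window features would then be fed to a classifier trained against arousal and valence labels; the choice of classifier and windowing is not specified here and would follow the paper itself.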