Affect recognition in an interactive gaming environment using eye tracking

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

External organisations

  • EESE Department
  • De Montfort University

Abstract

Eye tracking is a more accurate and less intrusive method for detecting the affective state than other current physiological sensing methods. This paper investigates the degree to which the affective state can be automatically recognised from eye tracking signals, namely pupillary responses. We tracked fourteen players while they interacted with a virtual gaming environment designed to elicit different affective states corresponding to the arousal and valence dimensions of Russell's circumplex model. Pupillary response features with luminosity compensation are used for automatic affect recognition. Moreover, pupillary response features based on the Hilbert transform are proposed to improve the recognition performance. The results demonstrate that recognition of a player's affective state on both dimensions was successful with pupillary response features, and that using the Hilbert transform improved the recognition performance to up to 76.0% and 61.4% on arousal and valence, respectively. This highlights the potential of affect-aware gaming interfaces based on pupillary response sensing.
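The abstract does not give implementation details, but the general idea of Hilbert-transform features can be illustrated. A minimal sketch, assuming a pre-processed pupil-diameter trace: the analytic signal is computed with SciPy's `hilbert`, and summary statistics of the instantaneous amplitude and frequency serve as features (the function name, feature choices, and synthetic trace below are illustrative assumptions, not the paper's actual feature set):

```python
import numpy as np
from scipy.signal import hilbert

def hilbert_features(pupil_diameter, fs):
    """Illustrative features from the analytic signal of a pupil trace.

    pupil_diameter : 1-D array of pupil-diameter samples
    fs             : sampling rate in Hz
    """
    x = np.asarray(pupil_diameter, dtype=float)
    x = x - x.mean()                            # remove DC so the envelope reflects fluctuations
    analytic = hilbert(x)                       # analytic signal via FFT-based Hilbert transform
    envelope = np.abs(analytic)                 # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic))       # instantaneous phase
    inst_freq = np.diff(phase) / (2 * np.pi) * fs  # instantaneous frequency in Hz
    return {
        "env_mean": envelope.mean(),
        "env_std": envelope.std(),
        "freq_mean": inst_freq.mean(),
        "freq_std": inst_freq.std(),
    }

# Synthetic pupil-diameter trace (mm): 3.5 mm baseline with a 0.5 Hz oscillation
fs = 60.0
t = np.arange(0, 10, 1 / fs)
trace = 3.5 + 0.3 * np.sin(2 * np.pi * 0.5 * t)
features = hilbert_features(trace, fs)
```

For a clean sinusoid like this, the envelope mean recovers the oscillation amplitude (about 0.3 mm) and the instantaneous-frequency mean recovers the oscillation rate (about 0.5 Hz), which is what makes such features a compact description of pupillary dynamics.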

Details

Original language: English
Title of host publication: 2017 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017
Publication status: E-pub ahead of print - 1 Feb 2018
Event: 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017 - San Antonio, United States
Duration: 23 Oct 2017 - 26 Oct 2017

Conference

Conference: 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017
Country: United States
City: San Antonio
Period: 23/10/17 - 26/10/17