Abstract
With continued advancements in portable eye-tracker technology freeing experimenters from the constraints of artificial laboratory designs, researchers can now collect gaze data from real-world, natural navigation. However, the field lacks a robust method for doing so: past approaches relied on time-consuming manual annotation of eye-tracking data, while previous attempts at automation lack the versatility needed for in-the-wild navigation trials consisting of complex and dynamic scenes. Here, we propose a system capable of informing researchers of where, and on what, a user's gaze is focused at any given time. The system achieves this by first running footage recorded with a head-mounted camera through a deep-learning-based object detection algorithm, the Mask Region-based Convolutional Neural Network (Mask R-CNN). The algorithm's output is combined with frame-by-frame gaze coordinates, measured by an eye-tracking device synchronized with the head-mounted camera, to detect and annotate, without any manual intervention, what the user looked at in each frame of the provided footage. The effectiveness of the presented methodology was validated by comparing the system's output with that of manual coders. High levels of agreement between the two established the system as a preferable data-collection technique, as it processed data at a significantly faster rate than its human counterparts. Support for the system's practicality was further demonstrated via a case study exploring the mediatory effects of gaze behaviors on an environment-driven attentional bias.
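The per-frame annotation step the abstract describes reduces to a point-in-mask lookup: for each video frame, the synchronized gaze coordinate is tested against the instance masks that Mask R-CNN returns for that frame. Below is a minimal sketch of that lookup; the function name (`annotate_gaze`) and the representation of detections as boolean masks paired with class labels are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def annotate_gaze(masks, labels, gaze_xy):
    """Label one frame: return the class of the detected object whose
    instance mask contains the gaze point, or 'background' if none does.

    masks   -- list of HxW boolean arrays, one per detected instance
               (the kind of per-frame output Mask R-CNN produces)
    labels  -- list of class names, parallel to `masks`
    gaze_xy -- (x, y) gaze coordinate in the frame's pixel space,
               assumed already synchronized to this frame
    """
    x, y = int(round(gaze_xy[0])), int(round(gaze_xy[1]))
    for mask, label in zip(masks, labels):
        h, w = mask.shape
        if 0 <= y < h and 0 <= x < w and mask[y, x]:
            return label  # gaze falls inside this instance's mask
    return "background"   # gaze point not on any detected object

# Toy usage: one 4x4 frame containing a single detected 'door' instance.
door = np.zeros((4, 4), dtype=bool)
door[1:3, 1:3] = True
print(annotate_gaze([door], ["door"], (2, 2)))  # -> 'door'
print(annotate_gaze([door], ["door"], (0, 0)))  # -> 'background'
```

Overlapping masks would require a tie-break rule (e.g., preferring the instance with the highest detection score); the sketch simply returns the first match.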
| Original language | English |
| --- | --- |
| Number of pages | 20 |
| Journal | Behavior Research Methods |
| Early online date | 1 Jun 2022 |
| DOIs | |
| Publication status | Published - 1 Jun 2022 |
Bibliographical note
Funding Information: This work was financially supported by the Biotechnology and Biological Sciences Research Council (BB/S003762/1) to S.-H.Y.
Publisher Copyright:
© 2022, The Author(s).
Keywords
- Gaze tracking
- Portable eye-tracker
- Object detection
- Deep learning
- Mask region-based convolutional neural network
Projects
Active spring muscle model - a new phenomenological model of skeletal muscle mechanics
Yeo, S.-H. (Principal Investigator)
Biotechnology & Biological Sciences Research Council
8/04/19 → 23/12/25
Project: Research Councils