Alex Fraser, University of Oxford
As online research has become more prevalent, researchers are investigating whether techniques that go beyond simple behavioural measurements can be replicated online. One method that has captured the imagination of researchers is leveraging the webcam to collect eye-tracking data. Several packages have been developed for collecting such data, but they have significant limitations due to extensive and potentially frustrating calibration procedures. Unfortunately, this can limit the accessibility of these packages when collecting data from specific populations, such as children and participants with neurodevelopmental difficulties. To overcome this, we have looked at how gaze detection studies are conducted with infants, where researchers manually score gaze direction from videos to minimise data loss.
Building on these methods, we have developed GazeScorer, an automated gaze scoring package that distinguishes between left, right, and central gaze locations using basic image processing. Using videos collected through a Gorilla-hosted experiment, we have demonstrated a good level of inter-rater reliability between GazeScorer and a manual scorer. This opens the possibility of a hybrid scoring system with minimal manual intervention in the short term. Future development will focus on utilising live webcam footage for data collection through the browser. This software would provide a resource for researchers who would benefit from gaze-based responses but do not require high spatial resolution.
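The three-way classification described above could in principle be reduced to a simple threshold on the pupil's horizontal position within the detected eye region. The sketch below illustrates that idea; the function name, the normalised-ratio approach, and the margin value are illustrative assumptions, not GazeScorer's actual API or thresholds.

```python
def classify_gaze(pupil_x: float, eye_left_x: float, eye_right_x: float,
                  margin: float = 0.15) -> str:
    """Classify gaze as 'Left', 'Central', or 'Right' from the pupil's
    horizontal position between the two eye corners.

    This is a minimal sketch of threshold-based gaze scoring, assuming
    the eye corners and pupil centre have already been located by an
    upstream image-processing step (e.g. facial landmark detection).
    """
    # Normalise the pupil position to [0, 1] across the eye's width.
    ratio = (pupil_x - eye_left_x) / (eye_right_x - eye_left_x)
    # A symmetric band around the midpoint counts as 'Central';
    # the margin width would need tuning against manually scored video.
    if ratio < 0.5 - margin:
        return "Left"
    if ratio > 0.5 + margin:
        return "Right"
    return "Central"
```

In practice the left/right assignment also depends on whether the webcam image is mirrored, which is one reason validation against a manual scorer remains important.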