Paper: EyeSee3D: A Low-Cost Approach for Analyzing Mobile 3D Eye Tracking Data Using Computer Vision and Augmented Reality Technology

Our first paper on the EyeSee3D approach was presented at the 2014 Symposium on Eye Tracking Research and Applications (ETRA) in Florida.

EyeSee3D Version 1.0

Automatic gaze analysis using tracking of fiducial markers in the scene-camera video.

Paper Abstract

For validly analyzing human visual attention, it is often necessary to proceed from computer-based desktop set-ups to more natural real-world settings. However, the resulting loss of control has to be counterbalanced by increasing participant and/or item count. Together with the effort required to manually annotate the gaze-cursor videos recorded with mobile eye trackers, this renders many studies unfeasible.

We tackle this issue by minimizing the need for manual annotation of mobile gaze data. Our approach combines geometric modelling with inexpensive 3D marker tracking to align virtual proxies with the real-world objects. This allows us to classify fixations on objects of interest automatically while supporting a completely free-moving participant.
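The classification step described above can be illustrated with a small sketch: once marker tracking yields the head pose, the gaze ray is expressed in world coordinates and intersected with the virtual proxy geometry of each object of interest. The following is a minimal illustration only, not the authors' implementation; the scene objects, their bounding boxes, and the gaze ray are hypothetical, and a real pipeline would derive the ray from the eye tracker and the fiducial-marker pose estimate.

```python
import numpy as np

def ray_aabb_distance(origin, direction, box_min, box_max):
    """Distance along the ray to an axis-aligned box, or None on a miss (slab method).

    For this sketch the direction is assumed to have no exactly-zero components.
    """
    inv = 1.0 / direction
    t1 = (box_min - origin) * inv
    t2 = (box_max - origin) * inv
    t_near = np.max(np.minimum(t1, t2))
    t_far = np.min(np.maximum(t1, t2))
    if t_near <= t_far and t_far >= 0:
        return max(t_near, 0.0)
    return None

def classify_fixation(origin, direction, proxies):
    """Label a gaze ray with the nearest intersected object of interest, or None."""
    best_name, best_t = None, np.inf
    for name, (bmin, bmax) in proxies.items():
        t = ray_aabb_distance(origin, direction, np.array(bmin), np.array(bmax))
        if t is not None and t < best_t:
            best_name, best_t = name, t
    return best_name

# Hypothetical scene: two objects of interest modelled as axis-aligned boxes
# (metres, world frame). Real proxies would come from the geometric scene model.
proxies = {
    "mug":  ((0.4, 0.0, 0.0), (0.6, 0.2, 0.2)),
    "book": ((1.0, 0.0, 0.0), (1.3, 0.1, 0.4)),
}
# Gaze ray in world coordinates: origin at the eye, direction from the tracked
# gaze vector transformed by the marker-derived head pose (here made up).
origin = np.array([0.5, 0.1, 1.0])
direction = np.array([0.05, -0.02, -1.0])
print(classify_fixation(origin, direction, proxies))  # prints "mug"
```

Logging the returned label per gaze sample replaces the frame-by-frame manual annotation of the scene-camera video.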

The paper presents the EyeSee3D method and compares an expensive outside-in approach (external cameras) with a low-cost inside-out approach (scene camera) for tracking the eye tracker's position. The EyeSee3D approach is evaluated by comparing the results of automatic and manual classification of fixation targets.

Citation:

  • Pfeiffer, T., & Renner, P. (2014). EyeSee3D: a low-cost approach for analysing mobile 3D eye tracking data using augmented reality technology. Proceedings of the Symposium on Eye Tracking Research and Applications, 195-202.
