We propose a new wearable system that estimates the 3D position of a gazed point by measuring multiple binocular view lines. In principle, the 3D position can be measured by triangulating the binocular view lines. In practice, however, it is difficult to measure these lines accurately with an eye-tracking device, because of errors arising from (1) the difficulty of calibrating the device and (2) the limitation that a human cannot fixate a distant point very accurately. To address (1), the calibration accuracy can be improved by modeling the optical properties of the camera in the device. To address (2), we propose a stochastic algorithm that determines the gazed 3D position by integrating the view lines observed at multiple head positions. We validated the effectiveness of the proposed algorithm experimentally.
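The triangulation step can be illustrated with a standard geometric construction: given several 3D view lines (two binocular lines, or the lines collected across multiple head positions), the estimated gaze point is the point minimizing the sum of squared distances to all lines. This is a minimal sketch of that least-squares triangulation, not the paper's actual algorithm; the function name and the use of NumPy are our own assumptions.

```python
import numpy as np

def closest_point_to_lines(points, dirs):
    """Least-squares 3D point minimizing squared distance to N lines.

    points: (N, 3) array, a point on each view line (e.g. an eye position)
    dirs:   (N, 3) array, the direction of each view line
    Hypothetical helper for illustration; not from the paper.
    """
    # Normalize the line directions.
    d = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
    # Projector onto the plane orthogonal to each direction: I - d d^T.
    # The squared distance from x to line i is || A_i (x - p_i) ||^2.
    A = np.eye(3)[None, :, :] - d[:, :, None] * d[:, None, :]
    S = A.sum(axis=0)                        # (3, 3) normal matrix
    b = np.einsum('nij,nj->i', A, points)    # sum_i A_i p_i
    # Solve (sum_i A_i) x = sum_i A_i p_i; requires non-parallel lines.
    return np.linalg.solve(S, b)
```

With noise-free lines the solution is the exact intersection; with noisy view lines (the realistic case the paper addresses), accumulating lines from multiple head positions widens the triangulation baseline and stabilizes the estimate.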