Gaze estimation systems use calibration procedures to estimate subject-specific parameters needed to compute the point-of-gaze. In these procedures, subjects are required to fixate on one or more specific points in space at specific time instances. Advanced remote gaze estimation systems can estimate the optical axis of the eye without any personal calibration procedure, but still require a single calibration point to estimate the angle between the optical axis and the visual axis (line of gaze). This paper presents a novel calibration procedure that does not require active user participation. To estimate the angles between the optical and visual axes of each eye, the procedure minimizes the distance between the intersections of the left and right eyes' visual axes with one or more observation surfaces (displays) while subjects look naturally at these displays (e.g., watching a video clip). Theoretical analysis and computer simulations show that the performance of the proposed procedure improves as the range of angles between the visual axes and the normals to the observation surfaces increases. Experiments with four subjects show that the subject-specific angles between the optical and visual axes can be estimated with an RMS error of 0.5°.
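The core idea of the procedure can be illustrated with a minimal sketch. The following is not the paper's implementation: it assumes a simplified horizontal-only angle between the optical and visual axes, a single display plane at z = 0, synthetic fixation data, and a brute-force grid search in place of a proper optimizer; all names and numeric values (eye positions, angles, fixation points) are illustrative assumptions. Given simulated "measured" optical axes, it recovers the per-eye angles by minimizing the summed squared distance between the left- and right-eye visual-axis intersections with the display.

```python
import math

# Hedged sketch: the eye/display geometry, angles, and fixation points below
# are illustrative assumptions, not values from the paper.

EYE_L = (-0.03, 0.0, 0.6)   # left eye centre of rotation (metres), display at z = 0
EYE_R = ( 0.03, 0.0, 0.6)   # right eye centre of rotation
TRUE_KAPPA = (1.5, -2.0)    # ground-truth horizontal optical-to-visual angles (degrees)

def rotate_horiz(d, deg):
    """Rotate a gaze direction about the vertical axis by deg degrees."""
    a = math.radians(deg)
    c, s = math.cos(a), math.sin(a)
    return (c * d[0] + s * d[2], d[1], -s * d[0] + c * d[2])

def hit_screen(origin, d):
    """Intersect the ray origin + t*d with the display plane z = 0."""
    t = -origin[2] / d[2]
    return (origin[0] + t * d[0], origin[1] + t * d[1])

def make_optical_axes(fixations):
    """Simulate measured optical axes: the visual axis rotated by -kappa."""
    data = []
    for p in fixations:
        axes = []
        for eye, k in zip((EYE_L, EYE_R), TRUE_KAPPA):
            visual = (p[0] - eye[0], p[1] - eye[1], p[2] - eye[2])
            axes.append(rotate_horiz(visual, -k))
        data.append(tuple(axes))
    return data

def cost(data, kl, kr):
    """Sum of squared left-right intersection distances on the display."""
    total = 0.0
    for ol, orr in data:
        xl, yl = hit_screen(EYE_L, rotate_horiz(ol, kl))
        xr, yr = hit_screen(EYE_R, rotate_horiz(orr, kr))
        total += (xl - xr) ** 2 + (yl - yr) ** 2
    return total

def calibrate(data, lo=-3.0, hi=3.0, step=0.1):
    """Brute-force grid search over candidate angle pairs for the minimum cost."""
    grid = [lo + i * step for i in range(int(round((hi - lo) / step)) + 1)]
    return min(((cost(data, kl, kr), kl, kr) for kl in grid for kr in grid))[1:]

# Oblique fixation points widen the range of angles between the visual axes
# and the display normal, which is what makes the angles observable.
fixations = [(x, y, 0.0) for x in (-0.5, -0.2, 0.2, 0.5) for y in (-0.3, 0.3)]
kl, kr = calibrate(make_optical_axes(fixations))
```

With purely frontal fixations the cost surface has a shallow valley along a common rotation of both eyes; spreading the fixation points obliquely across the display breaks that ambiguity, consistent with the abstract's observation that performance improves with the range of angles to the surface normals.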