A smart user interface for mobile consumer devices was developed using a robust eye-gaze system that requires no hand motion. Using the single camera and display already available in popular mobile devices, the eye-gaze system estimates the visual angle, which indicates the area of interest on the display and thus the cursor position. Three novel techniques were developed to make the system robust, user-independent, and invariant to head and device motion. First, by carefully investigating the geometric relation between the device and the user's cornea, a new algorithm was developed to estimate the cornea center position, which is directly related to the optical axis of the eye; unlike previous algorithms, it does not rely on the user-dependent cornea radius. Second, to make the system robust in practical use, an algorithm was developed to compensate for imaging-position errors caused by finite camera resolution. Third, a binocular algorithm was developed to estimate the user-dependent angular offsets between the optical and visual axes from only a single-point calibration. The proposed system was demonstrated to be accurate enough for many practical mobile user interfaces.
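To illustrate the final step the abstract describes, the sketch below shows one common way a gaze point can be derived once the cornea center, the optical-axis direction, and the calibrated angular offsets are known. This is a hedged illustration, not the paper's algorithm: the coordinate frame (display plane at z = 0), the rotation order, and the function and parameter names (`alpha_deg` for the horizontal offset, `beta_deg` for the vertical offset) are all assumptions introduced here.

```python
import math

def gaze_point_on_display(cornea_center, optical_dir, alpha_deg, beta_deg):
    """Illustrative sketch (assumptions, not the paper's formulation):
    rotate the optical-axis direction by the user-dependent angular
    offsets to obtain the visual axis, then intersect that ray with
    the display plane z = 0."""
    a = math.radians(alpha_deg)   # assumed horizontal offset
    b = math.radians(beta_deg)    # assumed vertical offset
    dx, dy, dz = optical_dir
    # Rotate about the y-axis by alpha (horizontal component of the offset)
    dx, dz = dx * math.cos(a) + dz * math.sin(a), -dx * math.sin(a) + dz * math.cos(a)
    # Rotate about the x-axis by beta (vertical component of the offset)
    dy, dz = dy * math.cos(b) - dz * math.sin(b), dy * math.sin(b) + dz * math.cos(b)
    cx, cy, cz = cornea_center
    t = -cz / dz  # ray parameter at which the visual axis reaches z = 0
    return (cx + t * dx, cy + t * dy, cz + t * dz)

# Example: cornea center 300 mm in front of the display,
# optical axis pointing straight at the screen.
x, y, z = gaze_point_on_display((0.0, 0.0, 300.0), (0.0, 0.0, -1.0), 5.0, 1.5)
```

With zero offsets the gaze point would coincide with the foot of the optical axis; nonzero offsets shift it by roughly distance times the tangent of each offset angle, which is why the user-dependent calibration the abstract describes matters.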