Combining eye gaze and hand tracking for pointer control in HCI: Developing a more robust and accurate interaction system for pointer positioning and clicking

2 Author(s)
Chuan, N.-K. (Product Quality & Reliability Eng. (PQRE), MIMOS Berhad, Kuala Lumpur, Malaysia); Sivaji, A.

This paper describes a method that combines eye gaze and hand tracking to provide a more robust and accurate way to control a pointer in a windows, icons, menus, and pointer (WIMP) graphical user interface (GUI). In the proposed interaction system, eye gaze is used to move a rectangular overlay called the 'area of interaction' (AOI) on the computer screen. The AOI defines the boundary for pointer positioning. Hand fingertip detection is then used to position the pointer within the AOI, and the presence of a second hand triggers a click event. We tested the performance of the method against a standalone hand tracking interaction. The combined tracking system was found to be more accurate and produced fewer errors than hand tracking alone. The system exploits the speed of eye gaze while overcoming its limited accuracy by switching to hand tracking for fine pointing.
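To make the described interaction loop concrete, the following is a minimal Python sketch of one frame of the combined method. It is based only on the abstract: get_gaze_point and get_fingertips are hypothetical stand-ins for real eye-tracker and hand-tracker APIs, and the screen size, AOI size, and coordinate conventions are illustrative assumptions, not values from the paper.

    # Sketch of the gaze + hand pointer-control loop described above.
    # The tracker functions are hypothetical stubs so the file runs as-is.
    import random

    SCREEN_W, SCREEN_H = 1920, 1080
    AOI_W, AOI_H = 400, 300        # size of the 'area of interaction' overlay (assumed)

    def get_gaze_point():
        """Hypothetical eye-tracker read: an (x, y) gaze estimate on the screen."""
        return random.uniform(0, SCREEN_W), random.uniform(0, SCREEN_H)

    def get_fingertips():
        """Hypothetical hand-tracker read: one fingertip per detected hand,
        normalised to [0, 1] within the camera frame."""
        n_hands = random.choice([0, 1, 1, 2])
        return [(random.random(), random.random()) for _ in range(n_hands)]

    def clamp(v, lo, hi):
        return max(lo, min(hi, v))

    def update_pointer():
        """One frame of the combined interaction: gaze coarsely positions the AOI,
        the first fingertip positions the pointer inside the AOI, and a second
        detected hand triggers a click event."""
        gx, gy = get_gaze_point()
        # Centre the AOI on the gaze estimate, clamped to the screen bounds.
        aoi_x = clamp(gx - AOI_W / 2, 0, SCREEN_W - AOI_W)
        aoi_y = clamp(gy - AOI_H / 2, 0, SCREEN_H - AOI_H)

        hands = get_fingertips()
        pointer, click = None, False
        if hands:
            fx, fy = hands[0]                    # first hand does fine positioning
            pointer = (aoi_x + fx * AOI_W, aoi_y + fy * AOI_H)
            click = len(hands) >= 2              # second hand present -> click
        return pointer, click

    if __name__ == "__main__":
        for _ in range(5):
            print(update_pointer())

The key design point the sketch illustrates is the division of labour: the fast but imprecise gaze signal only has to place a large rectangle, so the fingertip tracker operates over a much smaller region, which is what reduces pointing error relative to hand tracking over the full screen.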

Published in:

2012 IEEE Colloquium on Humanities, Science and Engineering (CHUSER)

Date of Conference:

3-4 Dec. 2012