Owing to the demand for more efficient and friendly human-computer interfaces, research on face processing has grown rapidly in recent years. Beyond providing services to people, one of the most important characteristics of a favorable system is the ability to interact with people autonomously. In view of these facts, this paper presents an automatic real-time face tracking system installed on a person-following robot. In the face tracking procedure, we employ an improved particle filter to dynamically locate a human face. Because we also consider the hair color of the head, the particle filter keeps tracking even when the person's back is turned to the camera. We further adopt both motion and color cues as features to reduce the influence of the background as much as possible. According to the position of the face in the image, we issue a series of commands (move forward, turn left, or turn right) to drive the robot's wheel motors, and we judge the distance between the robot and the person with the aid of three ultrasonic sensors to issue stop or back-up commands until the robot follows at a suitable distance from the person. Experimental results reveal that the face tracking rate is more than 95% in general situations and over 88% when the face suffers temporary occlusion; the execution efficiency of the system is also very satisfactory.
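The tracking-and-following loop described above can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the toy `face_likelihood` (a Gaussian standing in for the combined color and motion cues), the particle count, the image width, and the ultrasonic distance thresholds are all assumed values introduced here for illustration.

```python
import math
import random

N_PARTICLES = 100
IMG_W = 320              # assumed image width in pixels
TRUE_FACE = (160, 120)   # toy ground-truth face position for this sketch

def face_likelihood(x, y):
    # Toy stand-in for the combined skin/hair-color and motion cues:
    # a Gaussian peaked at the (assumed) true face position.
    dx, dy = x - TRUE_FACE[0], y - TRUE_FACE[1]
    return math.exp(-(dx * dx + dy * dy) / (2 * 30.0 ** 2))

def particle_filter_step(particles):
    """One predict-weight-resample cycle of a bootstrap particle filter."""
    # 1. Predict: diffuse particles with random-walk motion noise.
    moved = [(x + random.gauss(0, 5), y + random.gauss(0, 5))
             for x, y in particles]
    # 2. Weight: score each particle by the (toy) color+motion likelihood.
    weights = [face_likelihood(x, y) for x, y in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # 3. Resample particles in proportion to their weights.
    particles = random.choices(moved, weights=weights, k=N_PARTICLES)
    # 4. Estimate the face's horizontal position as the particle mean.
    cx = sum(x for x, _ in particles) / N_PARTICLES
    return particles, cx

def drive_command(face_x, distance_cm):
    """Map the face's image position and ultrasonic range to a motor command.
    The 40 cm / 60 cm thresholds are illustrative assumptions."""
    if distance_cm < 40:
        return "backward"        # too close: back away
    if distance_cm < 60:
        return "stop"            # comfortable following distance reached
    third = IMG_W / 3
    if face_x < third:
        return "turn_left"       # face in the left third of the image
    if face_x > 2 * third:
        return "turn_right"      # face in the right third of the image
    return "forward"             # face centered: keep following
```

In a real system, `face_likelihood` would evaluate the color histogram and frame-difference motion cue at each particle's image patch, and the ultrasonic readings from the three sensors would be fused before thresholding.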