In human-robot interaction, it is important for the robot to know the head movement, gaze direction, and facial expression of its conversation partner, since such information is deeply related to attention, intention, and emotion. Recently, many types of real-time measurement systems for head pose and gaze direction have been proposed and utilized in human-interface and ergonomics applications. We have previously proposed a non-contact, passive face measurement system that utilizes a stereo camera pair. However, the need for a stereo camera has several drawbacks. To cope with this problem, this paper proposes a method for measuring the 6 DOF motion of the head using a single camera. The use of a 3D face model enables us to predict six 2D motion vector fields of facial features, corresponding to the six unit motions in translation and rotation. An actual 2D motion vector field, obtained by tracking facial features, is then resolved into a linear combination of the six predicted motion vector fields, and the estimated 6 DOF motion is given by the six coefficients of that linear combination. An evaluation of the implemented motion estimation method showed that the accuracy of the proposed single-camera method is comparable to that of the conventional method using a stereo camera pair. Finally, the developed face tracking system is demonstrated by a humanoid robot interacting with two persons, which takes advantage of the wide field of view made possible because stereoscopy is no longer required.
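The decomposition described above can be sketched as a linear least-squares problem. The snippet below is a minimal illustration, not the paper's implementation: all names are assumptions, and the six predicted basis fields are stand-ins for those a 3D face model would generate. Each unit motion (three translations, three rotations) yields a predicted 2D motion vector field over the N tracked facial features; stacking these fields as columns of a matrix B, the observed field v is resolved as v ≈ Bc, and the six coefficients c are the estimated 6 DOF motion.

```python
import numpy as np

def estimate_6dof(basis_fields, observed_field):
    """Resolve an observed 2D flow field into a linear combination of
    six predicted unit-motion fields (hypothetical helper, for illustration).

    basis_fields: list of six (N, 2) predicted motion vector fields,
                  one per unit motion in translation and rotation.
    observed_field: (N, 2) motion vector field from feature tracking.
    Returns the six least-squares coefficients (the 6 DOF estimate).
    """
    # Flatten each (N, 2) field into a column of a (2N, 6) matrix.
    B = np.stack([f.reshape(-1) for f in basis_fields], axis=1)
    v = observed_field.reshape(-1)  # observed field as a (2N,) vector
    coeffs, *_ = np.linalg.lstsq(B, v, rcond=None)
    return coeffs

# Synthetic check: recover known coefficients from a noiseless mixture.
rng = np.random.default_rng(0)
basis = [rng.normal(size=(20, 2)) for _ in range(6)]
true_c = np.array([0.5, -1.0, 0.2, 0.1, 0.0, 0.3])
obs = sum(c * f for c, f in zip(true_c, basis))
print(np.allclose(estimate_6dof(basis, obs), true_c))  # expect True
```

With noisy tracked features, the least-squares fit averages the error over all features, which is one reason a linear-combination formulation is attractive for real-time estimation.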