In this paper, we describe an algorithm for synthesizing natural head motion on a 3D avatar from reconstructed 3D frontal face data, with the goal of making 3D video games immersive, engaging, and fun when played on 3D displays. The idea is that the user's head motion is computed frame by frame from a stereo image pair, without any markers, and transmitted to the avatar to generate realistic motion. The head motion is thus reproduced by the avatar in the video sequence in real time. The tools developed for these scenarios are also applicable to other settings, such as 3D teleconferencing with avatars in virtual spaces and human-computer interface applications.
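The per-frame head motion described above is a rigid rotation and translation of the reconstructed 3D face points between successive frames. One standard markerless way to recover such a transform is a rigid Procrustes (Kabsch) fit between the two point clouds; the sketch below illustrates this under the assumption that corresponding 3D points are available from the stereo reconstruction (the function name and the use of NumPy are illustrative, not taken from the paper):

```python
import numpy as np

def estimate_head_pose(prev_pts, curr_pts):
    """Estimate the rigid transform (R, t) mapping the previous frame's
    reconstructed 3D facial points onto the current frame's, using the
    Kabsch/Procrustes method. Both inputs are (N, 3) arrays of
    corresponding points."""
    # Center both point clouds on their centroids.
    mu_prev = prev_pts.mean(axis=0)
    mu_curr = curr_pts.mean(axis=0)
    P = prev_pts - mu_prev
    C = curr_pts - mu_curr
    # SVD of the cross-covariance matrix gives the optimal rotation.
    U, _, Vt = np.linalg.svd(P.T @ C)
    # Guard against a reflection (determinant -1) in the solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    # Translation aligns the centroids after rotation.
    t = mu_curr - R @ mu_prev
    return R, t
```

Applied once per stereo frame, the resulting (R, t) can be sent to the avatar to drive its head in real time; only the rigid parameters travel, not the full point cloud.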