In this paper we track a person's region of interest by both recovering the person's 3D trajectory in an indoor environment and estimating the head pose, which indicates the direction of attention. These two attributes can be combined to provide useful information for smart homes and other smart indoor environment applications. The work in this paper makes two main contributions. First, a nonlinear graph embedding method is used to robustly estimate the head yaw angle over the full 0° to 360° range in low-resolution images. Second, the person's trajectory is recovered, up to an affine transformation, with uncalibrated cameras. We show that under certain conditions an affine camera model can be assumed, under which both the 3D trajectory and the camera parameters can be optimally estimated simultaneously from 2D tracking. In this case the optimization problem is linear and the solution can be computed in real time. With this method no accurate camera calibration is required, yet we can still recover the person's trajectory and region of interest relative to the field defined by the rough placement of the cameras. The motivation of this work is therefore to facilitate and enable multi-camera systems for ubiquitous computing and ambient intelligence applications such as smart homes and smart meeting rooms. In such applications accurate calibration may be difficult for users to obtain, but with the proposed methods the functionalities of interest can still be achieved.
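The claim that the affine camera model makes the joint estimation of trajectory and camera parameters linear can be illustrated with the classic rank-3 factorization of the 2D measurement matrix (in the style of Tomasi–Kanade). The sketch below is not the paper's implementation; the function name, data layout, and synthetic test are assumptions for illustration only.

```python
import numpy as np

def affine_factorization(tracks):
    """Recover affine cameras and 3D structure from 2D tracks.

    tracks: array of shape (F, P, 2) -- P points tracked over F frames.
    Returns (M, S, t): M is the (2F, 3) stack of affine camera rows,
    S is the (3, P) structure, t the per-row translations, so that the
    measurement matrix is approximately M @ S + t. The reconstruction
    is determined only up to an affine transformation.
    """
    F, P, _ = tracks.shape
    # Stack x-coordinates over y-coordinates into a 2F x P matrix.
    W = np.concatenate([tracks[:, :, 0], tracks[:, :, 1]], axis=0)
    # Subtract each row's mean to remove the translation component.
    t = W.mean(axis=1, keepdims=True)
    Wc = W - t
    # Under an affine camera, the centered matrix has rank 3, so a
    # truncated SVD gives the (linear) least-squares factorization.
    U, s, Vt = np.linalg.svd(Wc, full_matrices=False)
    M = U[:, :3] * np.sqrt(s[:3])
    S = np.sqrt(s[:3])[:, None] * Vt[:3]
    return M, S, t

# Synthetic check: random affine cameras observing random 3D points.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 20))             # 20 hypothetical 3D points
F = 5
tracks = np.empty((F, 20, 2))
for f in range(F):
    A = rng.normal(size=(2, 3))          # affine projection
    b = rng.normal(size=(2, 1))          # image-plane offset
    uv = A @ X + b
    tracks[f, :, 0] = uv[0]
    tracks[f, :, 1] = uv[1]

M, S, t = affine_factorization(tracks)
W = np.concatenate([tracks[:, :, 0], tracks[:, :, 1]], axis=0)
residual = np.linalg.norm(M @ S + t - W)
print(residual < 1e-8)  # noise-free data is reconstructed exactly
```

Because the factorization reduces to a single SVD, it runs in real time for the track lengths typical of indoor monitoring, which is consistent with the real-time claim in the abstract.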