Smartphones with diverse sensing capabilities are becoming widely available and pervasive in use. With the phone becoming a mobile personal computer, integrated applications can use multi-sensory data to derive information about the user's actions and the context in which these actions occur. This paper develops a novel method to assess daily living patterns using a smartphone equipped with microphones and inertial sensors. We develop a feature-space combination approach for fusing information from sensors sampled at different rates and present a computationally lightweight algorithm to identify various high-level activities. Preliminary results from an initial deployment among eight users indicate the potential for accurate, context-aware, and personalized sensing.
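The abstract does not specify the paper's features or fusion details, but the core idea of feature-space combination across sensors with different sampling rates can be sketched as follows: cut each stream into windows of the same duration, compute per-window features independently, and concatenate the aligned feature rows. The 8 kHz audio rate, 50 Hz accelerometer rate, and mean/variance features below are illustrative assumptions, not the paper's actual choices.

```python
import numpy as np

def window_features(signal, rate, win_sec=1.0):
    """Per-window mean and variance of a 1-D signal sampled at `rate` Hz."""
    n = int(rate * win_sec)                  # samples per window
    usable = (len(signal) // n) * n          # drop any trailing partial window
    w = signal[:usable].reshape(-1, n)       # one row per window
    return np.column_stack([w.mean(axis=1), w.var(axis=1)])

# Hypothetical streams covering the same 10-second span:
# microphone at 8 kHz, accelerometer magnitude at 50 Hz.
rng = np.random.default_rng(0)
audio = rng.normal(size=8000 * 10)
accel = rng.normal(size=50 * 10)

# Because both streams are cut into identical 1-second windows, their
# per-window feature rows align and can be concatenated in feature space,
# yielding one fused feature vector per window for a downstream classifier.
fused = np.hstack([window_features(audio, 8000),
                   window_features(accel, 50)])
print(fused.shape)  # → (10, 4)
```

Fusing at the feature level, rather than resampling raw signals to a common rate, lets each sensor keep features suited to its own time scale while still producing a single aligned vector per window.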