The prevalence of obesity worldwide presents a major challenge to existing healthcare systems. There is a general need for pervasive monitoring of the dietary behaviour of those who are at risk of co-morbidities. Currently, however, there is no accurate method of assessing the nutritional intake of people in their home environment. Traditional methods require subjects to respond manually to questionnaires, a process that is subjective, prone to error, and makes consistency and compliance difficult to ensure. In this paper, we present a wearable sensor platform that autonomously provides detailed information about a subject's dietary habits. The sensor consists of a microphone and a camera and is worn discreetly on the ear. Sound features are extracted in real time, and when a chewing activity is classified, the camera captures a video sequence for further analysis. From this sequence, a number of key frames are extracted to represent important episodes during the course of a meal. Results show a high classification accuracy for chewing activities, and the visual log provides a detailed overview of the subject's food intake that is difficult to obtain from manually acquired food records.
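The sound-triggered capture pipeline described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the features (log energy, zero-crossing rate), the rule-based classifier, and all thresholds are assumptions chosen for illustration; the paper's actual feature set and classifier are not specified in the abstract.

```python
import numpy as np

def frame_features(frame):
    """Compute simple per-frame sound features: log energy and
    zero-crossing rate (illustrative stand-ins for the real features)."""
    energy = np.log(np.sum(frame ** 2) + 1e-12)
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0
    return energy, zcr

def is_chewing(frame, energy_thresh=-5.0, zcr_thresh=0.3):
    """Hypothetical rule: chewing sounds are assumed to be relatively
    loud (high energy) but low-frequency (low zero-crossing rate)."""
    energy, zcr = frame_features(frame)
    return energy > energy_thresh and zcr < zcr_thresh

def monitor(frames, capture_video):
    """Classify each incoming audio frame; when chewing is detected,
    trigger the camera callback to capture a video sequence."""
    events = []
    for i, frame in enumerate(frames):
        if is_chewing(frame):
            capture_video(i)   # stand-in for starting video capture
            events.append(i)
    return events
```

In a real deployment the classifier would be trained on labelled chewing sounds and the camera trigger would debounce repeated detections; the loop structure, however, matches the event-driven design the abstract describes.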