Abstract:
A novel vision-based approach for estimating individual dietary intake in food-sharing scenarios is proposed in this paper, incorporating food detection, face recognition, and hand tracking techniques. The method is validated on panoramic videos that capture subjects' eating episodes. The results demonstrate that the proposed approach can reliably estimate the food intake of each individual as well as the eating sequence. To identify the food items ingested by each subject, a transfer learning approach is designed: 4,200 food images with segmentation masks, 1,500 of which are newly annotated, are used to fine-tune a deep neural network for the targeted food intake application. In addition, a method for associating detected hands with subjects is developed, and the outcomes of face recognition are refined to enable the quantification of individual dietary intake in communal eating settings.
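The abstract does not specify the segmentation architecture used for the transfer learning step. As a minimal sketch only, assuming a COCO-pretrained Mask R-CNN from torchvision and a hypothetical number of food categories, fine-tuning on an annotated food dataset could look like the following (dataset loading and the training loop are omitted):

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

# Hypothetical class count: background + assumed food categories (not from the paper)
NUM_CLASSES = 16

# Start from a COCO-pretrained instance-segmentation model
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the box-classification head for the food classes
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

# Replace the mask-prediction head so segmentation masks match the food classes
in_features_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
model.roi_heads.mask_predictor = MaskRCNNPredictor(in_features_mask, 256, NUM_CLASSES)

# Fine-tune all trainable parameters on the 4,200 mask-annotated food images
params = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(params, lr=0.005, momentum=0.9, weight_decay=0.0005)
```

This is an illustrative transfer-learning setup, not the authors' implementation; the paper's network choice, class list, and hyperparameters may differ.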
Published in: 2019 IEEE 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN)
Date of Conference: 19-22 May 2019
Date Added to IEEE Xplore: 25 July 2019