
Eyemotion: Classifying Facial Expressions in VR Using Eye-Tracking Cameras


Abstract:

One of the main challenges of social interaction in virtual reality settings is that head-mounted displays occlude a large portion of the face, blocking facial expressions and thereby restricting social engagement cues among users. We present an algorithm to automatically infer expressions by analyzing only a partially occluded face while the user is engaged in a virtual reality experience. Specifically, we show that images of the user's eyes captured from an IR gaze-tracking camera within a VR headset are sufficient to infer a subset of facial expressions without the use of any fixed external camera. Using these inferences, we can generate dynamic avatars in real time which function as an expressive surrogate for the user. We propose a novel data collection pipeline as well as a novel approach for increasing CNN accuracy via personalization. Our results show a mean accuracy of 74% (F1 of 0.73) among 5 'emotive' expressions and a mean accuracy of 70% (F1 of 0.68) among 10 distinct facial action units, outperforming human raters.
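The abstract describes the task (classify a small set of expressions from IR eye-camera crops) without giving the network architecture. As an illustration only, the sketch below uses a toy linear-softmax classifier as a stand-in for the paper's CNN; the label set, image size, and all class/function names are assumptions, not the authors' implementation.

```python
import numpy as np

# Assumed 5-way 'emotive' label set; the paper's exact labels are not listed here.
EXPRESSIONS = ["neutral", "happy", "angry", "surprised", "eyes-closed"]

rng = np.random.default_rng(0)

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class EyeExpressionClassifier:
    """Toy stand-in for the paper's CNN: maps IR eye-image crops to
    expression probabilities. Hypothetical architecture and sizes."""

    def __init__(self, img_shape=(32, 64), n_classes=len(EXPRESSIONS)):
        d = img_shape[0] * img_shape[1]
        self.W = rng.normal(0.0, 0.01, size=(d, n_classes))
        self.b = np.zeros(n_classes)

    def predict_proba(self, ir_eye_images):
        # ir_eye_images: (batch, H, W) grayscale IR crops of the eye region
        x = ir_eye_images.reshape(len(ir_eye_images), -1)
        return softmax(x @ self.W + self.b)

    def predict(self, ir_eye_images):
        probs = self.predict_proba(ir_eye_images)
        return [EXPRESSIONS[i] for i in probs.argmax(axis=1)]

# Example: classify a batch of 4 synthetic 32x64 IR eye crops.
batch = rng.normal(size=(4, 32, 64))
clf = EyeExpressionClassifier()
labels = clf.predict(batch)
```

The personalization step the abstract mentions would, in this toy framing, amount to fine-tuning `W` and `b` on a few labeled calibration frames from each user before inference.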
Date of Conference: 07-11 January 2019
Date Added to IEEE Xplore: 07 March 2019
ISBN Information:
Print on Demand(PoD) ISSN: 1550-5790
Conference Location: Waikoloa, HI, USA

