Abstract:
Trying on clothes in clothing stores is usually a time-consuming activity, and it may not even be possible in settings such as online shopping. Our motivation is to improve the time efficiency and accessibility of trying on clothes by creating a virtual dressing room environment. In this work, we introduce a virtual dressing room application that uses the Microsoft Kinect sensor. Our proposed approach is based on extracting the user from the video stream, aligning cloth models with the user, and skin color detection. We use the locations of the tracked skeletal joints to position, scale, and rotate the 2D cloth models so that they align with the user. We then apply skin color detection to the video stream to handle unwanted occlusions between the user and the model. Finally, the model is superimposed on the user in real time. The problem is essentially the alignment of the user and the cloth models with accurate position, scale, rotation, and ordering; detecting the user and the body parts is therefore one of the main steps. In the literature, several approaches have been proposed for body part detection, skeletal tracking, posture estimation, and superimposing the user onto a virtual environment in the user interface. The project is implemented in C# as a real-time Kinect application. The Kinect driver's middleware is used for various fundamental functions and for the tracking process in combination with the Microsoft Kinect.
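The two core operations the abstract describes, fitting a 2D cloth model to the user's tracked joints and masking skin pixels to resolve occlusions, can be illustrated with a minimal sketch. This is not the authors' C# implementation; it is a hypothetical Python/NumPy version assuming shoulder joints are already available as pixel coordinates, and it uses a classic explicit RGB skin rule rather than whatever classifier the paper actually employs:

```python
import numpy as np

def cloth_scale(left_shoulder, right_shoulder, model_shoulder_px):
    """Scale factor that fits a 2D cloth model, whose shoulder width is
    model_shoulder_px pixels, to the user's tracked shoulder joints.
    Joints are (x, y) pixel coordinates from the skeletal tracker."""
    user_width = float(np.hypot(right_shoulder[0] - left_shoulder[0],
                                right_shoulder[1] - left_shoulder[1]))
    return user_width / model_shoulder_px

def skin_mask(rgb):
    """Boolean mask of likely skin pixels for an RGB uint8 image.

    Uses a well-known explicit RGB rule (illustrative only):
    R > 95, G > 40, B > 20, max - min > 15, |R - G| > 15, R > G, R > B.
    Pixels flagged here would be drawn above the cloth model so the
    user's hands and face are not covered by it."""
    rgb = rgb.astype(np.int32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return ((r > 95) & (g > 40) & (b > 20)
            & (rgb.max(axis=-1) - rgb.min(axis=-1) > 15)
            & (np.abs(r - g) > 15) & (r > g) & (r > b))

# Example usage: a 1x2 frame with one skin-toned and one blue pixel.
frame = np.array([[[205, 140, 110], [30, 60, 200]]], dtype=np.uint8)
mask = skin_mask(frame)            # → [[True, False]]
scale = cloth_scale((40, 100), (120, 100), 160.0)  # → 0.5
```

Rotation would follow the same pattern, taking the angle of the shoulder-to-shoulder vector via `atan2`; the ordering step then composites cloth pixels over the user and skin pixels over the cloth.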
Published in: 2019 IEEE Jordan International Joint Conference on Electrical Engineering and Information Technology (JEEIT)
Date of Conference: 09-11 April 2019
Date Added to IEEE Xplore: 20 May 2019