This paper presents a real-time system that detects and tracks a bare hand against a cluttered background, using skin detection and a hand-posture contour comparison algorithm after face subtraction, and that recognizes hand gestures using Principal Component Analysis (PCA). In the training stage, a set of hand posture images with different scales, rotations, and lighting conditions is trained. The most significant eigenvectors of the training images are then determined, and the training weights are computed by projecting each training image onto those eigenvectors. In the testing stage, for every frame captured from a webcam, the hand gesture is detected using our algorithm; the small image containing the detected gesture is then projected onto the most significant eigenvectors of the training images to form its test weights. Finally, the gesture is recognized by finding the minimum Euclidean distance between the test weights and the training weights of each training image.
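The PCA recognition pipeline described above (compute the leading eigenvectors of the training set, project images onto them to obtain weight vectors, then classify by minimum Euclidean distance) can be sketched as follows. This is a minimal illustration with NumPy, not the paper's implementation; the function names, the number of eigenvectors `k`, and the use of the small-covariance trick are assumptions for the sketch.

```python
import numpy as np

def train_pca(images, k=4):
    """Training stage: images is an (n, d) array of flattened hand images.

    Returns the mean image, the k most significant eigenvectors (d, k),
    and the (n, k) training weights. Hypothetical helper, not from the paper.
    """
    mean = images.mean(axis=0)
    centered = images - mean
    # Small-covariance trick: eigendecompose the (n, n) matrix instead of
    # the (d, d) covariance, which is far cheaper when n << d.
    cov = centered @ centered.T
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1][:k]          # k largest eigenvalues
    eig = centered.T @ vecs[:, order]           # map back to image space
    eig /= np.linalg.norm(eig, axis=0)          # unit-norm eigenvectors
    weights = centered @ eig                    # project training images
    return mean, eig, weights

def recognize(frame, mean, eig, weights):
    """Testing stage: project a detected hand image onto the eigenvectors
    and return the index of the nearest training image by Euclidean distance.
    """
    w = (frame - mean) @ eig                    # test weights
    dists = np.linalg.norm(weights - w, axis=1)
    return int(np.argmin(dists))
```

In use, each detected hand region from a webcam frame would be cropped, resized to the training resolution, flattened, and passed to `recognize`; the returned index identifies the closest stored posture.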