Hand gesture recognition: self-organising maps as a graphical user interface for the partitioning of large training data sets

Authors: G. Heidemann (AG Neuroinformatics, Bielefeld University, Germany), H. Bekel, I. Bax, A. Saalbach

Abstract:

Gesture recognition is a difficult task in computer vision due to the numerous degrees of freedom of the human hand. Fortunately, human gestures cover only a small part of the theoretical "configuration space" of a hand, so an appearance-based representation of gesture becomes tractable. A major problem, however, is the acquisition of appropriately labelled image data from which an appearance-based representation can be built. In this work we apply self-organising maps to visualise large amounts of segmented hands performing pointing gestures. A graphical interface then facilitates easy labelling of the data set. The labelled set is used to train a neural classification system, which is itself embedded in a larger architecture for the recognition of gestural reference to objects.
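As a rough illustration of the core idea, the sketch below fits a Kohonen-style self-organising map to feature vectors and assigns each sample to a grid cell, so that similar samples land in the same or neighbouring cells and can be labelled as a group through a GUI. The feature representation, grid size, and learning schedule here are illustrative assumptions only, not the configuration used by the authors.

import numpy as np

def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Fit a rectangular self-organising map to `data` (n_samples x n_features)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    # Initialise prototype vectors randomly within the data range.
    weights = rng.uniform(data.min(), data.max(), size=(h, w, data.shape[1]))
    # Grid coordinates of each node, used by the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)              # decaying learning rate
        sigma = sigma0 * (1.0 - epoch / epochs) + 0.5  # shrinking neighbourhood radius
        for x in rng.permutation(data):
            # Best-matching unit: node whose prototype is closest to the sample.
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), (h, w))
            # Gaussian neighbourhood around the BMU on the grid.
            grid_dist = np.linalg.norm(coords - np.array(bmu), axis=-1)
            influence = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))[..., None]
            # Pull all prototypes towards the sample, weighted by neighbourhood influence.
            weights += lr * influence * (x - weights)
    return weights

def map_samples(data, weights):
    """Return the grid cell (row, col) each sample falls into."""
    h, w, _ = weights.shape
    flat = weights.reshape(h * w, -1)
    idx = np.argmin(np.linalg.norm(data[:, None, :] - flat[None, :, :], axis=-1), axis=1)
    return [divmod(int(i), w) for i in idx]

if __name__ == "__main__":
    # Stand-in for feature vectors of segmented hand images (e.g. downscaled pixels).
    fake_hands = np.random.default_rng(1).normal(size=(500, 64))
    som = train_som(fake_hands)
    cells = map_samples(fake_hands, som)
    print(cells[:5])  # images in the same or nearby cells can be labelled as a group

Because the map preserves topology, neighbouring grid cells hold visually similar hand images, which is what makes a grid-based labelling interface practical for large data sets.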

Published in:

Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004), Vol. 4

Date of Conference:

23-26 Aug. 2004