
Feature Selection for Zero-Shot Gesture Recognition


Abstract:

Existing classification techniques assign a predetermined categorical label to each sample and cannot recognize new categories that appear after the training stage. This limitation has led to the advent of new paradigms in machine learning such as zero-shot learning (ZSL). ZSL aims to recognize unseen categories from a high-level description of them. While deep learning has pushed the limits of ZSL for object recognition, ZSL for temporal problems such as unfamiliar gesture recognition (ZSGL) remains largely unexplored. Previous attempts to address ZSGL focused on the creation of gesture attributes, attribute-based datasets, and algorithmic improvements; little or no research has been concerned with feature selection for ZSGL problems. It is indisputable that deep learning has obviated the need for feature engineering for problems with large datasets. However, when data is scarce, it is critical to leverage domain information to create discriminative input features. The main goal of this work is to study the effect of three feature extraction techniques (raw features, engineered features, and deep-learning features) on ZSGL performance. Next, we propose a new approach for ZSGL that jointly minimizes the reconstruction, semantic, and classification losses. Our methodology yields an unseen-class accuracy of 38%, which parallels the accuracies obtained through state-of-the-art approaches.
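The abstract names the joint objective but does not spell it out; one plausible reading is L = L_rec + lambda_sem * L_sem + lambda_cls * L_cls. Below is a minimal PyTorch sketch of such a formulation, assuming an encoder-decoder over gesture features with a projection head into an attribute space. The model shape, names (ZSGLModel, zsgl_loss, lam_sem, lam_cls), layer sizes, and the nearest-attribute classification rule are all assumptions for illustration, not the authors' actual architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ZSGLModel(nn.Module):
    """Hypothetical encoder/decoder with a semantic (attribute) head."""
    def __init__(self, feat_dim, latent_dim, attr_dim):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(feat_dim, latent_dim), nn.ReLU())
        self.decoder = nn.Linear(latent_dim, feat_dim)
        self.to_attr = nn.Linear(latent_dim, attr_dim)  # latent -> attribute space

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), self.to_attr(z)

def zsgl_loss(model, x, y, attrs, lam_sem=1.0, lam_cls=1.0):
    """Joint objective: reconstruction + semantic + classification losses.

    x:     (batch, feat_dim) input gesture features
    y:     (batch,) seen-class labels
    attrs: (num_seen_classes, attr_dim) per-class attribute vectors
    """
    x_hat, a_hat = model(x)
    loss_rec = F.mse_loss(x_hat, x)          # reconstruct the input features
    loss_sem = F.mse_loss(a_hat, attrs[y])   # regress onto class attributes
    logits = a_hat @ attrs.t()               # compatibility with each seen class
    loss_cls = F.cross_entropy(logits, y)    # discriminate among seen classes
    return loss_rec + lam_sem * loss_sem + lam_cls * loss_cls

At test time, a standard ZSL inference step would score a sample's predicted attribute vector against the unseen classes' attribute vectors and pick the best match; whether the paper uses exactly this rule is not stated on this page.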
Date of Conference: 16-20 November 2020
Date Added to IEEE Xplore: 18 January 2021
Conference Location: Buenos Aires, Argentina

