Combining sparse and dense descriptors with temporal semantic structures for robust human action recognition

4 Author(s)
Jie Chen; Guoying Zhao; V. Kellokumpu; M. Pietikäinen (Dept. of Comput. Sci. & Eng., Univ. of Oulu, Oulu, Finland)

Automatic categorization of human actions in the real world is highly challenging due to large intra-class variation. In this paper, we present a new method for robust recognition of human actions. We first cluster each video in the training set into temporal semantic segments using a dense descriptor. Each segment is then represented by a concatenated histogram of sparse and dense descriptors, and these segment histograms are used to train a classifier. In the recognition stage, a query video is likewise divided into temporal semantic segments by clustering; the trained classifier assigns each segment a confidence score, and the query video is classified by combining the confidences of its segments. To evaluate our approach, we perform experiments on two challenging datasets, i.e., the Olympic Sports Dataset (OSD) and the Hollywood Human Action dataset (HOHA). We also test our method on the benchmark KTH human action dataset. Experimental results confirm that our algorithm outperforms state-of-the-art methods.
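The segment-level pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the choice of classifier, the confidence scores, and the fusion rule (here, averaging per-class confidences across segments) are all assumptions, since the abstract does not specify them.

```python
# Hedged sketch of the segment-based recognition scheme from the abstract.
# Assumptions: per-segment per-class confidence dicts, and averaging as
# the fusion rule -- the paper's actual classifier and fusion may differ.

def concat_histograms(sparse_hist, dense_hist):
    """Represent a segment by concatenating its sparse-descriptor and
    dense-descriptor histograms, as the abstract describes."""
    return sparse_hist + dense_hist

def classify_video(segment_confidences):
    """Combine per-segment confidences (one {class: score} dict per
    segment) by averaging, then return the highest-scoring class."""
    totals = {}
    for conf in segment_confidences:
        for label, score in conf.items():
            totals[label] = totals.get(label, 0.0) + score
    n = len(segment_confidences)
    return max(totals, key=lambda label: totals[label] / n)

# Toy example: a query video clustered into two temporal segments,
# each scored against two hypothetical action classes.
segments = [
    {"high-jump": 0.7, "long-jump": 0.3},
    {"high-jump": 0.4, "long-jump": 0.6},
]
print(classify_video(segments))  # "high-jump": (0.7 + 0.4) > (0.3 + 0.6)
```

The per-segment histogram above would, in the paper's setting, be the concatenation of quantized sparse and dense local descriptors; any classifier that emits per-class scores slots into `classify_video` unchanged.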

Published in:

2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops)

Date of Conference:

6-13 Nov. 2011