The contrast between motion and appearance representation of STIP in human action classification

Authors: Hong-Bo Zhang; Shang-An Li; Xian-Ming Lin; Bi-Xia Liu — Dept. of Cognitive Sci., Xiamen Univ., Xiamen, China

In human action classification, many effective methods are based on the detection and representation of Space-Time Interest Points (STIP). Two representations of the STIP feature are widely used: appearance description and motion description. However, the appearance representation is limited by inter-class variations. In this paper, experiments are designed to demonstrate the effect of the motion description in human action classification by comparing it with the appearance description. The experiments use the KTH dataset, with HOG and HOF features serving as the typical appearance and motion representations, respectively. On this basis, the paper makes two comparisons of STIP descriptions in human action classification. The first directly applies an SVM classifier to STIP features with the different descriptions and uses the per-STIP categories to vote for the action category. The second substitutes the motion description for the original description in a state-of-the-art method. All of the results show that the motion representation of STIP outperforms the appearance representation in human action classification.
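The first comparison scheme — classify each STIP individually, then let the STIPs vote for the video-level action — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the per-STIP SVM is replaced here by a hypothetical nearest-centroid classifier, and the 4-D "HOF-like" descriptors are synthetic stand-ins.

```python
import numpy as np

def classify_stips(descriptors, centroids):
    """Assign each STIP descriptor to the nearest class centroid
    (a simple stand-in for the per-STIP SVM used in the paper)."""
    # Pairwise distances, shape (n_stips, n_classes).
    d = np.linalg.norm(descriptors[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

def vote_action(stip_labels, n_classes):
    """Majority vote over per-STIP labels gives the video-level action."""
    counts = np.bincount(stip_labels, minlength=n_classes)
    return int(counts.argmax())

# Toy example: 2 action classes with synthetic 4-D "HOF-like" descriptors.
rng = np.random.default_rng(0)
centroids = np.array([[0.0, 0.0, 0.0, 0.0],
                      [5.0, 5.0, 5.0, 5.0]])
# A video whose STIP descriptors cluster around class 1:
video_stips = rng.normal(loc=5.0, scale=1.0, size=(20, 4))
labels = classify_stips(video_stips, centroids)
print(vote_action(labels, n_classes=2))  # → 1
```

The voting step is what makes the comparison between HOG and HOF descriptions direct: only the descriptor fed to `classify_stips` changes, while the aggregation from STIPs to an action label stays fixed.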

Published in:

2011 IEEE International Conference on Computer Science and Automation Engineering (CSAE), Volume 4

Date of Conference:

10-12 June 2011