Semantic annotation and retrieval of unedited video based on extraction of 3D camera motion

Authors:

W. Waizenegger, I. Feldmann, O. Schreer (Fraunhofer Institute for Telecommunications, Berlin)

Abstract:

In this paper, we propose an approach to high-level semantic annotation of 3D camera motion information in videos. Query by example is a common technique for video search and retrieval and has proved its capability for image features. However, complex temporal properties of video clips can hardly be described via examples, since examples often fail to capture the relevant information. High-level semantic information therefore seems much more appropriate for future video search and retrieval engines built entirely on user-friendly interfaces. Well-known self-calibration techniques provide robust and reliable information on camera motion, but this rich source of information is not yet well exploited for high-level semantic video annotation, search, and retrieval. We evaluate the full temporal information along a sequence and exploit it to annotate meaningful high-level semantics such as turn, rotation, fast camera movement, and transversal motion. We present a first approach to deriving general motion properties of the camera within video segments and show the potential for further investigations.
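The abstract itself gives no implementation details. As a rough illustration of the idea it describes, the Python sketch below maps per-frame camera poses, such as those recovered by self-calibration, to coarse semantic labels. The pose conventions (world-to-camera rotations, +z optical axis), the thresholds, and the label names are assumptions chosen for illustration, not the authors' method.

import numpy as np

def rotation_angle(R_prev, R_curr):
    # Angle (radians) of the relative rotation between consecutive frames.
    # Uses trace(R) = 1 + 2*cos(theta) for a 3x3 rotation matrix.
    R_rel = R_curr @ R_prev.T
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return np.arccos(cos_theta)

def annotate_segment(rotations, positions, fps,
                     turn_thresh=0.02, speed_thresh=0.5):
    # rotations: list of 3x3 world-to-camera rotation matrices, one per frame
    # positions: list of 3-vectors, camera centres in world coordinates
    # fps:       frames per second of the video segment
    # Threshold values (rad/frame, units/s) are placeholders, not from the paper.
    ang_vel = [rotation_angle(rotations[i - 1], rotations[i])
               for i in range(1, len(rotations))]
    speed = [np.linalg.norm(positions[i] - positions[i - 1]) * fps
             for i in range(1, len(positions))]

    labels = set()
    if np.mean(ang_vel) > turn_thresh:
        labels.add("turn/rotation")
    if np.mean(speed) > speed_thresh:
        labels.add("fast moving camera")
    # Net translation mostly perpendicular to the initial viewing
    # direction is labelled as transversal motion.
    if len(positions) > 1:
        view_dir = rotations[0].T @ np.array([0.0, 0.0, 1.0])
        motion = positions[-1] - positions[0]
        norm = np.linalg.norm(motion)
        if norm > 0 and abs(np.dot(motion / norm, view_dir)) < 0.3:
            labels.add("transversal motion")
    return labels or {"static"}

On real footage one would presumably smooth the estimated pose trajectory and tune the thresholds per sequence before deriving labels, since raw self-calibration output is noisy.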

Published in:

2008 International Workshop on Content-Based Multimedia Indexing (CBMI 2008)

Date of Conference:

18-20 June 2008