Tap-to-search: Interactive and contextual visual search on mobile devices

5 Author(s)
Ning Zhang ; Ryerson Multimedia Res. Lab., Ryerson Univ., Toronto, ON, Canada ; Tao Mei ; Xian-Sheng Hua ; Ling Guan

Mobile visual search has been an emerging topic for both the research and industrial communities. Among various methods, visual search has the merit of providing an alternative where text and voice search are not applicable. This paper proposes an interactive "tap-to-search" approach that combines the individual's intention, expressed by selecting regions of interest via "tap" actions on the mobile touch screen, with a visual recognition-by-search mechanism over a large-scale image database. An automatic image segmentation technique is applied to provide region candidates. Visual vocabulary tree based search is adopted, incorporating rich contextual information collected from mobile sensors. The proposed approach was evaluated on an image dataset at the scale of two million images. We demonstrate that, using GPS contextual information, the approach achieves satisfactory results under standard information retrieval evaluation.
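The pipeline the abstract describes (segment the image into region candidates, pick the tapped region, score it against a database with a vocabulary-tree-style TF-IDF match, then re-rank using GPS context) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: all function names, data structures, and parameters (e.g. the flat TF-IDF scoring standing in for a hierarchical vocabulary tree, and the `radius_km` GPS cutoff) are assumptions for exposition.

```python
import math

# Hypothetical sketch of a tap-to-search pipeline; names and data
# structures are illustrative, not from the paper.

def pick_region(segments, tap_xy):
    """Return the segment (dict with a 'bbox' (x0, y0, x1, y1)) containing the tap."""
    x, y = tap_xy
    for seg in segments:
        x0, y0, x1, y1 = seg["bbox"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return seg
    return None

def tf_idf_score(query_words, db_words, n_images, doc_freq):
    """Simplified vocabulary-tree scoring: sum IDF weights of shared visual words.

    A real vocabulary tree quantizes local descriptors hierarchically;
    here visual words are opaque tokens for brevity.
    """
    shared = set(query_words) & set(db_words)
    return sum(math.log(n_images / doc_freq[w]) for w in shared)

def gps_filter(candidates, query_gps, radius_km=1.0):
    """Keep database images whose GPS tag lies within radius_km (flat-earth approx.)."""
    qlat, qlon = query_gps
    kept = []
    for c in candidates:
        lat, lon = c["gps"]
        # ~111 km per degree of latitude; scale longitude by cos(latitude).
        dist = 111.0 * math.hypot(lat - qlat,
                                  (lon - qlon) * math.cos(math.radians(qlat)))
        if dist <= radius_km:
            kept.append(c)
    return kept
```

In this sketch, GPS acts as a hard pre-filter before visual scoring; the paper's actual fusion of sensor context with the vocabulary-tree ranking may differ.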

Published in:

2011 IEEE 13th International Workshop on Multimedia Signal Processing (MMSP)

Date of Conference:

17-19 Oct. 2011