
Efficient strategies for many-task frequent pattern mining in cloud computing environments

2 Author(s)
Lin, K.W.; Dept. of Comput. Sci. & Inf. Eng., Nat. Kaohsiung Univ. of Appl. Sci., Kaohsiung, Taiwan; Yu-Chin Luo

The goal of data mining is to discover hidden, useful information in large databases. Mining frequent patterns from transaction databases is an important problem in the data mining field. As the size of the database increases, the computation time and the required memory grow severely. Parallel and distributed computing techniques have attracted extensive attention over the past decades for their ability to manage and process large amounts of data, and the difficulty of mining large databases has motivated research on parallel and distributed algorithms for this problem. However, most past studies did not address the many-task issue, which is very important, especially in cloud computing environments. In cloud computing environments, an application is provided as a service, like the Google search engine, meaning that it is used by many users at the same time. In this paper, we propose a set of strategies for many-task frequent pattern mining. Through empirical evaluations under various simulation conditions, the proposed strategies deliver excellent performance in terms of execution time.
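
For context, frequent pattern mining finds itemsets that appear in at least a given fraction (the minimum support) of the transactions in a database. The sketch below is only a brute-force, single-machine illustration of that basic task, assuming transactions are given as Python sets of item names; it is not the paper's many-task strategy, and the function name and parameters are illustrative.

    from itertools import combinations
    from collections import Counter

    def frequent_itemsets(transactions, min_support, max_size=3):
        """Count every candidate itemset of up to max_size items and keep
        those whose support (fraction of transactions containing the
        itemset) is at least min_support. Brute force, for illustration
        only; real miners such as Apriori or FP-growth prune candidates."""
        n = len(transactions)
        counts = Counter()
        for t in transactions:
            items = sorted(set(t))
            for k in range(1, min(max_size, len(items)) + 1):
                for combo in combinations(items, k):
                    counts[combo] += 1
        return {itemset: c / n for itemset, c in counts.items()
                if c / n >= min_support}

    # Usage example: each transaction is a set of purchased items.
    transactions = [
        {"bread", "milk"},
        {"bread", "diapers", "beer"},
        {"milk", "diapers", "beer"},
        {"bread", "milk", "diapers"},
    ]
    print(frequent_itemsets(transactions, min_support=0.5))

Because the candidate space and the memory needed to count it grow quickly with database size, this kind of computation is what motivates the parallel, distributed, and many-task approaches discussed in the paper.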

Published in:

2010 IEEE International Conference on Systems, Man and Cybernetics (SMC)

Date of Conference:

10-13 Oct. 2010