Hadoop plugin for distributed and parallel image processing

Authors:

İlginç Demir (TÜBİTAK, Bilişim Teknolojileri Enstitüsü, Turkey); Ahmet Sayar

Abstract:

The Hadoop Distributed File System (HDFS) is widely used for large-scale data storage, and Hadoop processes the data it holds in parallel through the MapReduce programming model. The work presented in this paper proposes a novel Hadoop plugin for processing image files with the MapReduce model. The plugin introduces image-related I/O formats and new classes for creating records from input files. HDFS is designed to work with a small number of large files; the proposed technique therefore merges many small files into one large file, avoiding the performance loss caused by handling a large number of small files. In this way, each task can process multiple images in a single run cycle. The effectiveness of the proposed technique is demonstrated by an application scenario for face detection on distributed image files.
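The plugin's own I/O format and record classes are not reproduced here; the following is a minimal, hypothetical Java sketch of the general merging idea the abstract describes: many small image files are packed into a single Hadoop SequenceFile (filename as key, raw bytes as value) so that one map task processes several images per run. The names ImageBundleSketch, packImages and ImageMapper are illustrative, not from the paper, and the mapper only decodes each image; a real job would run a face detector there and would read the bundle with SequenceFileInputFormat.

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.awt.image.BufferedImage;
import javax.imageio.ImageIO;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class ImageBundleSketch {

    /** Merge small image files into one SequenceFile: filename -> raw image bytes. */
    public static void packImages(Configuration conf, Path out, String... localImages)
            throws IOException {
        try (SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                SequenceFile.Writer.file(out),
                SequenceFile.Writer.keyClass(Text.class),
                SequenceFile.Writer.valueClass(BytesWritable.class))) {
            for (String img : localImages) {
                byte[] bytes = Files.readAllBytes(Paths.get(img));
                writer.append(new Text(img), new BytesWritable(bytes));
            }
        }
    }

    /** One call per image record; a single map task therefore sees many images. */
    public static class ImageMapper
            extends Mapper<Text, BytesWritable, Text, IntWritable> {
        @Override
        protected void map(Text fileName, BytesWritable imageBytes, Context context)
                throws IOException, InterruptedException {
            // Decode the image from the record's byte payload. A real job would
            // apply a face detector here and emit the number of detected faces.
            BufferedImage img =
                    ImageIO.read(new ByteArrayInputStream(imageBytes.copyBytes()));
            int processed = (img != null) ? 1 : 0;
            context.write(fileName, new IntWritable(processed));
        }
    }
}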

Published in:

2012 20th Signal Processing and Communications Applications Conference (SIU)

Date of Conference:

18-20 April 2012