Abstract:
An enormous amount of data is being generated at a tremendous rate by multiple sources, and this data often exists in different formats, making it difficult to process with traditional methods. The platforms used for processing this type of data rely on distributed architectures such as cloud computing and Hadoop. Big data can be processed efficiently by exploiting the characteristics of the underlying platforms. With efficient algorithms and software metrics, and by identifying the relationships among these measures, system characteristics can be evaluated in order to improve the overall performance of the computing system. By focusing on the measures that play an important role in determining overall performance, service level agreements can also be revised. This paper presents a survey of performance modeling techniques for big data applications. A key concept in performance modeling is finding relevant parameters that accurately represent the performance of big data platforms. These extracted performance measures are mapped onto software quality concepts, which are then used for defining service level agreements.
Published in: 2017 7th International Conference on Cloud Computing, Data Science & Engineering - Confluence
Date of Conference: 12-13 January 2017
Date Added to IEEE Xplore: 08 June 2017