The increasing volume of data, and the demand to process and analyze it, is driving enterprises to adopt alternative computing paradigms to overcome shortages in computing capacity. Cloud Computing is a newly emerged computing paradigm that promises resource scalability, on-demand availability, and a pay-as-you-go economic model instead of heavy up-front investment in IT resources. However, the decision to adopt the Cloud as a computing platform should weigh all aspects, including cost and performance trade-offs. In this paper, we conduct an experimental study to uncover the practical challenges of developing a data-intensive application and to compare the realistic costs of Cloud deployment against local servers. We also present our method of calculating Cloud and local resource costs, analyze the experimental results, and discuss the issues of exploiting the Cloud for two major application categories: interactive and batch.
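To make the pay-as-you-go versus up-front-investment trade-off concrete, the minimal sketch below contrasts hourly Cloud rental with the amortized cost of a purchased server. All figures (hourly rate, hardware price, lifetime, operating expenses) are illustrative assumptions for this sketch, not values measured in the paper.

```python
# Illustrative cost comparison: Cloud pay-as-you-go vs. a locally owned server.
# All numbers below are hypothetical assumptions, not the paper's measurements.

CLOUD_RATE_PER_HOUR = 0.50      # assumed on-demand instance price (USD/hour)
SERVER_PURCHASE_COST = 8000.0   # assumed local server hardware price (USD)
SERVER_LIFETIME_YEARS = 3       # assumed amortization period
ANNUAL_OPEX = 1200.0            # assumed power, cooling, and admin per year (USD)


def cloud_cost(compute_hours: float) -> float:
    """Pay-as-you-go: cost scales linearly with actual usage."""
    return compute_hours * CLOUD_RATE_PER_HOUR


def local_cost(compute_hours: float) -> float:
    """Owned server: fixed investment amortized over its lifetime,
    paid regardless of how many hours are actually used."""
    return SERVER_PURCHASE_COST + ANNUAL_OPEX * SERVER_LIFETIME_YEARS


if __name__ == "__main__":
    for hours in (500, 5_000, 20_000):
        print(f"{hours:>6} h: cloud = ${cloud_cost(hours):>9,.2f}, "
              f"local = ${local_cost(hours):>9,.2f}")
```

Under these assumptions, the Cloud is cheaper for light or bursty workloads, while a heavily utilized local server wins at sustained high usage; this is the kind of cost trade-off the paper examines empirically.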