Abstract:
The microservice architecture is increasingly used to build complex applications. This software architecture offers scalability, modularity, and agility to development processes. However, ensuring optimal performance prior to deployment remains a significant challenge, especially within the fast-paced environments of Continuous Integration/Continuous Deployment (CI/CD) pipelines. Traditional performance testing often relies on synthetic scenarios and lengthy testing processes, and can be difficult to adopt in an environment where testing needs to be both realistic and quick. Addressing this need, our paper proposes and implements an innovative framework that leverages real-world usage traces to identify and execute a small yet essential set of performance tests. This approach aims to integrate seamlessly with CI/CD workflows, offering developers quick feedback on performance issues and scalability constraints that may arise from changes to one or more microservices. Through a series of empirical evaluations, we compare the ability of different techniques to identify a set of performance tests that capture historically observed system behaviour and can be executed within a specified time budget. In our paper, we successfully replicated the response time distribution of a 24-hour test on our custom testbench within just a five-minute test, achieving a relative percent error of only 0.96% and 1.40% at the 95th and 99th percentiles, respectively. This considerable decrease in the time and resources necessary for load testing and response time modeling demonstrates the efficiency and effectiveness of our approach.
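The evaluation metric above, relative percent error at a given response-time percentile, can be sketched as follows. This is an illustrative computation only; the function name and the synthetic lognormal samples (standing in for the 24-hour baseline and the five-minute test) are assumptions, not part of the paper's actual traces or testbench:

```python
import numpy as np

def relative_percent_error(baseline, candidate, percentile):
    """Relative percent error between two response-time samples
    at a given percentile (e.g. 95 or 99)."""
    b = np.percentile(baseline, percentile)
    c = np.percentile(candidate, percentile)
    return abs(c - b) / b * 100.0

# Hypothetical response-time samples (ms); lognormal is a common
# stand-in shape for latency distributions.
rng = np.random.default_rng(0)
long_run = rng.lognormal(mean=3.0, sigma=0.5, size=100_000)   # "24-hour" baseline
short_run = rng.lognormal(mean=3.0, sigma=0.5, size=5_000)    # "5-minute" test

for p in (95, 99):
    print(f"P{p} relative percent error: "
          f"{relative_percent_error(long_run, short_run, p):.2f}%")
```

A short test run is considered representative when this error stays small at the tail percentiles that matter for user-facing latency.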
Date of Conference: 07-13 July 2024
Date Added to IEEE Xplore: 28 August 2024