Web services have become a de facto standard for achieving interoperability among business applications over the Internet, and quality of service (QoS) is often an essential determinant when selecting among candidate services. From the client's perspective, the Performance/Cost Ratio (PCR) is a significant parameter for measuring the actual QoS level, so clients need a way to derive the PCR of a service they invoke; a mechanism is therefore required to evaluate the PCR so that both clients and providers can plan conveniently. Considering the uncertainty of the physical environment, we present a novel approach that models Web service invocation as a Markov process, which can be analyzed to determine its steady-state behavior; from this analysis we obtain the steady-state distribution of the usage states of each Web service. The Markov process is initialized through a Bayesian learning algorithm and adapts to nondeterministic invocation behavior. Our theoretical analysis demonstrates that the scheme is sound and copes well with changing environments.
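To illustrate the kind of steady-state analysis the abstract refers to, the sketch below approximates the stationary distribution of a small discrete-time Markov chain over hypothetical service usage states. The three states and the transition matrix are invented for illustration and are not taken from the paper; the method (power iteration on π = πP) is standard.

```python
def steady_state(P, iterations=1000, tol=1e-12):
    """Approximate the stationary distribution pi satisfying pi = pi * P
    by repeatedly multiplying an initial distribution by P (power iteration)."""
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iterations):
        nxt = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(pi, nxt)) < tol:
            return nxt
        pi = nxt
    return pi

# Hypothetical 3-state usage model for a Web service: idle, busy, overloaded.
P = [
    [0.7, 0.2, 0.1],
    [0.3, 0.5, 0.2],
    [0.2, 0.3, 0.5],
]

pi = steady_state(P)
print([round(p, 3) for p in pi])  # long-run fraction of time in each state
```

In the paper's setting, the transition probabilities would instead be initialized from observed invocations via Bayesian learning and updated as invocation behavior changes, with the resulting steady-state distribution feeding into the PCR evaluation.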