The outage probability is a fundamental performance metric that has been widely investigated in the literature. In this paper, we develop a general method to compute the outage probability given the moment generating function (MGF). When computing the outage from the MGF, integration along the standard Bromwich contour suffers from a loss of accuracy due to the oscillatory nature of the integrand. One can address this difficulty by using Cauchy's theorem to replace the Bromwich contour with an appropriate equivalent contour. For highly accurate numerical results, the steepest-descent contour is the most suitable replacement. Unfortunately, this optimal contour cannot, in general, be expressed in closed form. The class of Talbot contours, characterized by three parameters, provides an alternative. However, it is not clear how these parameters should be tuned for best results. We propose the use of linear regression to set the parameter values so as to minimize the mismatch between the corresponding Talbot contour and the steepest-descent contour. The resulting integral has a smooth and rapidly decaying integrand, making it possible to evaluate the outage probability with high accuracy using a simple numerical integration method. This approach is general in the sense that it works for any system whose MGF is known. Thus, it can handle a wide range of fading distributions and a variety of communication systems.
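To illustrate the underlying idea, the following is a minimal sketch of Talbot-contour Laplace inversion applied to an outage computation. It uses the well-known fixed Talbot parameterization of Abate and Valkó rather than the regression-tuned contour proposed in the paper, and the Rayleigh-fading MGF $M(s) = 1/(1 + s\bar{\gamma})$ is a standard illustrative example, not taken from the paper itself. The outage probability at SNR threshold $\gamma_{th}$ is the inverse Laplace transform of $M(s)/s$ evaluated at $\gamma_{th}$.

```python
import cmath
import math

def talbot_invert(F, t, M=32):
    """Invert a Laplace transform F(s) at t > 0 on the fixed Talbot
    contour s(theta) = r*theta*(cot(theta) + i), theta in (0, pi),
    with r = 2M/(5t) (Abate-Valko parameterization). The deformed
    contour yields a smooth, rapidly decaying integrand, so the
    simple M-point sum below already converges very quickly."""
    r = 2.0 * M / (5.0 * t)
    # Endpoint term at theta = 0, where s = r on the real axis.
    total = 0.5 * F(r) * math.exp(r * t)
    for k in range(1, M):
        theta = k * math.pi / M
        cot = math.cos(theta) / math.sin(theta)
        s = r * theta * (cot + 1j)                 # contour point
        sigma = theta + (theta * cot - 1.0) * cot  # from ds/dtheta
        total += (cmath.exp(t * s) * F(s) * (1.0 + 1j * sigma)).real
    return (r / M) * total

# Hypothetical example: Rayleigh fading with mean SNR gbar.
# MGF (E[exp(-s*gamma)] convention): M(s) = 1/(1 + s*gbar).
gbar, g_th = 2.0, 1.0
outage = talbot_invert(lambda s: 1.0 / (s * (1.0 + gbar * s)), g_th)
exact = 1.0 - math.exp(-g_th / gbar)  # closed form, for comparison
```

With the Rayleigh MGF the closed-form outage is available as a sanity check; for more general fading models, where no closed form exists, only the MGF is needed, which is the generality the abstract refers to.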