Stochastic processes in the general Poisson family, distinguished by rates that change at the occurrence of events rather than with the passage of time, accurately describe the software error detection process. The author has developed several promising new models of this general type that apply to hardware and software reliability problems, especially during the burn-in and wear-out phases. The growth models that describe wear-out apply to software when, because of pervasive patching, the system inexorably degrades. Models in which the failure rate is decreased at the occurrence of an event (such as an error detection and removal in the case of software debugging), either by a fixed amount or to a fraction of the most recent rate, are reviewed, and new applications are described or suggested. Additional related models in which the rates increase are described, and some possible applications are suggested.
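The two rate-reduction schemes mentioned above can be illustrated with a minimal simulation sketch: inter-detection times are drawn from an exponential distribution at the current rate, and each detection lowers the rate either by a fixed decrement or to a fraction of its previous value. This is an illustrative assumption of the model class described, not the author's exact formulation; all function and parameter names here are hypothetical.

```python
import random


def simulate_detections(initial_rate, n_events, mode="fraction",
                        decrement=0.1, fraction=0.8, seed=0):
    """Sketch of an event-modified Poisson process: the failure rate
    changes at each detection event, not with elapsed time.

    mode="fixed":    rate drops by a fixed amount after each event.
    mode="fraction": rate drops to a fraction of its previous value.
    (Hypothetical names; illustrative only.)
    """
    rng = random.Random(seed)
    rate = initial_rate
    times, t = [], 0.0
    for _ in range(n_events):
        # Exponential waiting time governed by the current rate.
        t += rng.expovariate(rate)
        times.append(t)
        # The detection (error removal) event itself modifies the rate.
        if mode == "fixed":
            rate = max(rate - decrement, 1e-9)
        else:
            rate *= fraction
    return times


times = simulate_detections(initial_rate=5.0, n_events=10)
```

As the rate falls after each removal, the expected gaps between successive detections grow, which is the qualitative signature of reliability growth during debugging.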