Recently, the information potential (IP) of order α, defined as the argument of the logarithm in the α-order Rényi entropy, has been successfully used as an information-theoretic criterion for supervised adaptive system training. In this paper, we use the survival function (or, equivalently, the distribution function) of an absolute-value-transformed random variable to define a new information potential, named the survival information potential (SIP). Compared with the IP, the SIP has several advantages, such as validity over a wide range of distributions, robustness, and simplicity of computation. The properties of the SIP and a simple formula for computing the empirical SIP are given in the paper. Finally, the SIP criterion is applied to adaptive system training, and simulation examples on FIR adaptive filtering, kernel adaptive filtering, and time delay neural network (TDNN) training are presented to demonstrate its performance.
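As a rough illustration of the definition above, the following sketch estimates an empirical SIP from samples, assuming the SIP of order α is the integral of the empirical survival function of |X| raised to the power α (by analogy with the IP, which integrates the α-th power of the density). This is an assumption based on the abstract's description; the paper's exact estimator may differ in detail.

```python
import numpy as np

def empirical_sip(x, alpha=2.0):
    """Sketch of an empirical survival information potential (SIP).

    Assumes SIP_alpha(X) = integral over t >= 0 of Fbar_{|X|}(t)**alpha dt,
    where Fbar_{|X|} is the survival function of |X| (an assumption inferred
    from the abstract, not the paper's stated formula).
    """
    a = np.sort(np.abs(np.asarray(x, dtype=float)))
    n = a.size
    # The empirical survival function of |X| is piecewise constant,
    # taking the value (n - i)/n on the segment ending at a[i].
    surv = (n - np.arange(n)) / n
    # Segment lengths between consecutive sorted absolute values,
    # starting from 0.
    gaps = np.diff(np.concatenate(([0.0], a)))
    return float(np.sum(surv**alpha * gaps))
```

Because the estimator only requires sorting the absolute samples and summing weighted gaps, it avoids density estimation entirely, which is consistent with the simplicity of computation claimed for the SIP.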