A measure of simplified dependency representing Markovian characteristics is introduced, based on Shannon's entropy and conditional entropy under the Gaussian assumption. It is argued to be the most concise measure for expressing the higher-order statistical properties of a time series and, in this respect, superior to correlation or spectral measures. Simplified dependency is shown to be closely related to the prediction error in autoregressive analysis of a time series and to be applicable to non-Gaussian processes as well. Both the truncation method of the distribution and ensemble dependency analysis are informative for clarifying the statistical characteristics of interval sequences with skewed distributions in a heterogeneous time series. These techniques serve to clarify the neural modulation mechanism.
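The core idea can be sketched numerically. Under the Gaussian assumption, the differential entropy of a series is 0.5·log(2πe·Var(x)) and its conditional entropy given the autoregressive past is 0.5·log(2πe·σ²ₑ), where σ²ₑ is the one-step prediction-error variance; their difference reduces to 0.5·log(Var(x)/σ²ₑ). The function below is an illustrative reconstruction of such a measure (the name `simplified_dependency`, the least-squares AR fit, and the default model order are assumptions, not the paper's exact procedure):

```python
import numpy as np

def simplified_dependency(x, order=1):
    """Entropy-based dependency estimate under the Gaussian assumption.

    Computes H(x_t) - H(x_t | x_{t-1}, ..., x_{t-order}), which for a
    Gaussian process equals 0.5 * log(Var(x) / sigma_e^2), where
    sigma_e^2 is the AR prediction-error variance. Zero for white
    noise; larger values indicate stronger Markovian dependency.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    # Design matrix of lagged values: column k holds lag k+1.
    X = np.column_stack([x[order - 1 - k : n - 1 - k] for k in range(order)])
    y = x[order:]
    # Least-squares fit of the AR coefficients.
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef                 # one-step prediction errors
    return 0.5 * np.log(x.var() / resid.var())

# Example: an AR(1) process x_t = 0.8 x_{t-1} + e_t has
# Var(x)/sigma_e^2 = 1/(1 - 0.8^2), so the measure should be
# close to 0.5 * log(1 / 0.36) ~ 0.51.
rng = np.random.default_rng(0)
e = rng.standard_normal(20000)
x = np.empty_like(e)
x[0] = e[0]
for t in range(1, len(e)):
    x[t] = 0.8 * x[t - 1] + e[t]
print(simplified_dependency(x))
```

Because the measure depends only on the ratio of the marginal to the prediction-error variance, it summarizes the predictable structure in one number, in line with the abstract's link between the dependency measure and the autoregressive prediction error.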