In recent years, there has been growing interest in quantifying the interaction and integration between different neuronal activities in the brain. One problem of interest has been to quantify how different neuronal sites communicate with each other. For this purpose, various measures of functional integration, such as spectral coherence, phase synchrony, and mutual information, have been proposed. In this paper, we introduce information-theoretic measures such as entropy and divergence to quantify the interaction between different neuronal sites. The information-theoretic measures introduced in this paper are adapted to the time-frequency domain to account for the dynamic nature of neuronal activity. Time-frequency distributions are two-dimensional energy density functions of time and frequency, and can be treated in a way similar to probability density functions. Since time-frequency distributions are not always positive, information measures such as Rényi entropy and Jensen-Rényi divergence, rather than the well-known Shannon entropy, are adapted to this new domain. In this paper, we first discuss some properties of these modified measures and then illustrate their application to neural signals. The proposed measures are applied to multiple-electrode electroencephalogram (EEG) recordings to quantify the interaction between different neuronal sites and between different cognitive states.
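The two measures named in the abstract have standard definitions once a time-frequency distribution is normalized to unit energy: the Rényi entropy of order α is H_α(C) = (1 − α)⁻¹ log₂ Σ C(t,f)^α, and the Jensen-Rényi divergence between two distributions is the entropy of their mixture minus the mean of their entropies. The sketch below, which is not the paper's implementation, illustrates these formulas in NumPy; the function names and the choice α = 3 (an odd integer order commonly used for distributions that may take negative values) are illustrative assumptions.

```python
import numpy as np

def renyi_entropy(tfd, alpha=3.0):
    """Rényi entropy of order alpha of a time-frequency distribution.

    The TFD is normalized to unit sum so it can be treated like a 2-D
    probability density, as described in the abstract. alpha=3 is an
    illustrative choice; odd integer orders keep the sum real even when
    the distribution has negative values.
    """
    c = tfd / tfd.sum()                      # normalize to unit "energy"
    return np.log2(np.sum(c ** alpha)) / (1.0 - alpha)

def jensen_renyi_divergence(tfd1, tfd2, alpha=3.0):
    """Jensen-Rényi divergence between two (normalized) TFDs.

    Entropy of the equal-weight mixture minus the mean of the two
    individual entropies; zero when the distributions coincide.
    """
    c1 = tfd1 / tfd1.sum()
    c2 = tfd2 / tfd2.sum()
    h_mix = renyi_entropy(0.5 * (c1 + c2), alpha)
    return h_mix - 0.5 * (renyi_entropy(c1, alpha) + renyi_entropy(c2, alpha))
```

As a sanity check on these definitions, a uniform distribution over N cells has Rényi entropy log₂ N for any order α, and the divergence between two distributions with disjoint supports grows with how far apart the mixture entropy sits from the individual entropies.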