The exquisite human ability to perceive facial features has been explained by the activity of neurons particularly responsive to faces, found in the fusiform gyrus and the anterior part of the superior temporal sulcus. This study hypothesizes and demonstrates that face processing can be automatically discriminated from the processing of a simple control stimulus, online and with high temporal resolution, by applying measures of statistical dependence to steady-state visual evoked potentials extracted from EEG. Correlation, mutual information, and a novel measure of association, referred to as the generalized measure of association (GMA), were applied to bandpass-filtered current source density data. Dependencies between channel locations were assessed for two separate conditions elicited by distinct pictures (a face and a Gabor grating) flickering at a rate of 17.5 Hz. Filter settings were chosen to minimize the distortion that bandpass parameters introduce into dependence estimation. Statistical analysis for automated stimulus classification was performed using the Kolmogorov-Smirnov test. Results show active regions in the occipito-parietal part of the brain for both conditions, with greater dependence between occipital and inferotemporal sites for the face stimulus. GMA achieved higher performance in discriminating the two conditions. Because no additional face-like stimuli were examined, this study establishes a basic difference between one particular face and one nonface stimulus; future work may use additional stimuli and experimental manipulations to determine the specificity of the current connectivity results.
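The pipeline described above (bandpass filtering around the 17.5 Hz flicker frequency, pairwise dependence between channel signals, and a two-sample Kolmogorov-Smirnov test across conditions) can be sketched on synthetic data. This is a minimal illustration, not the study's implementation: the sampling rate, filter band, noise level, trial counts, coupling strengths, and the use of Pearson correlation as the dependence measure are all assumptions, and GMA itself is not reproduced here.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.stats import pearsonr, ks_2samp

rng = np.random.default_rng(0)
fs = 500.0          # sampling rate in Hz (assumed)
f_stim = 17.5       # flicker rate reported in the study
t = np.arange(0, 2.0, 1.0 / fs)

def channel_pair(coupling):
    """Two synthetic 'channels' sharing a 17.5 Hz SSVEP component,
    with independent broadband noise on each channel."""
    common = np.sin(2 * np.pi * f_stim * t)
    a = common + 2.0 * rng.standard_normal(t.size)
    b = coupling * common + 2.0 * rng.standard_normal(t.size)
    return a, b

# Narrow bandpass around the flicker frequency (illustrative settings).
b_coef, a_coef = butter(4, [15.0, 20.0], btype="band", fs=fs)

def dependence(x, y):
    """Dependence between two filtered channels; Pearson correlation
    stands in for the study's dependence measures."""
    xf = filtfilt(b_coef, a_coef, x)
    yf = filtfilt(b_coef, a_coef, y)
    r, _ = pearsonr(xf, yf)
    return abs(r)

# Simulate many trials per condition: the 'face' condition gets stronger
# coupling between the two sites than the control condition.
face = [dependence(*channel_pair(1.0)) for _ in range(40)]
ctrl = [dependence(*channel_pair(0.2)) for _ in range(40)]

# A two-sample Kolmogorov-Smirnov test compares the two distributions
# of dependence values, as in the automated classification step.
stat, p = ks_2samp(face, ctrl)
print(f"KS statistic = {stat:.2f}, p = {p:.3g}")
```

With a clear gap in coupling between the two simulated conditions, the KS statistic is large and the p-value small, mirroring how a difference in inter-site dependence can drive automated stimulus discrimination.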