Corticomuscular coupling analysis based on multiple data sets, such as electroencephalography (EEG) and electromyography (EMG) signals, provides a useful tool for understanding human motor control systems. Two of the most popular methods are the pair-wise magnitude-squared coherence (MSC) between EEG and simultaneously recorded EMG signals, and partial least squares (PLS). Unfortunately, MSC and PLS generally handle only two types of data sets at a time, whereas we may need to analyze more than two. Moreover, it is not straightforward to extend MSC to the group level to combine results across subjects. In addition, PLS can suffer from an information-mixing problem, since only the variations in one data set are used to predict the other. To address these concerns, we propose a joint multimodal analysis framework for corticomuscular coupling analysis. The proposed framework models multiple data spaces simultaneously in a multidirectional fashion. Furthermore, to address inter-subject variability in real-world medical applications, we extend the proposed framework from the individual-subject level to the group level to obtain common corticomuscular coupling patterns across subjects. We apply the proposed framework to concurrent EEG, EMG, and behavioral data collected in a Parkinson's disease (PD) study. The results reveal several highly correlated temporal patterns among the three types of signals, along with their corresponding spatial activation patterns. In PD subjects, there are enhanced connections between the occipital region and other regions, which is consistent with previous medical findings. The proposed framework is a promising technique for multi-subject, multi-modal data analysis.
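To make the baseline method concrete, the sketch below illustrates pair-wise MSC, the standard technique the abstract contrasts with the proposed joint multimodal framework. It is not the paper's method: the sampling rate, recording duration, and the shared 20 Hz (beta-band) drive in the synthetic EEG and EMG channels are illustrative assumptions, and the estimate uses Welch's method via `scipy.signal.coherence`.

```python
import numpy as np
from scipy.signal import coherence

# Illustrative sketch, not the paper's framework: magnitude-squared coherence
# MSC(f) = |S_xy(f)|^2 / (S_xx(f) * S_yy(f)) between one EEG channel and one
# EMG channel. All signal parameters below are assumptions for demonstration.
rng = np.random.default_rng(0)
fs = 1000.0                        # sampling rate in Hz (assumed)
t = np.arange(0, 10.0, 1.0 / fs)   # 10 s of synthetic data

shared = np.sin(2 * np.pi * 20.0 * t)              # common 20 Hz cortical drive
eeg = shared + 0.8 * rng.standard_normal(t.size)   # EEG = drive + noise
emg = shared + 0.8 * rng.standard_normal(t.size)   # EMG = drive + noise

# Welch-based MSC estimate; msc lies in [0, 1] at each frequency bin
f, msc = coherence(eeg, emg, fs=fs, nperseg=1024)
peak_freq = f[np.argmax(msc)]
print(f"peak coherence {msc.max():.2f} near {peak_freq:.1f} Hz")
```

Because MSC is computed one channel pair at a time, analyzing many EEG channels, EMG channels, and behavioral measures together requires a combinatorial number of such pairwise estimates, which is one motivation for the joint multimodal framework described above.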