This paper proposes a novel nonrigid inter-subject multichannel image registration method that combines information from different modalities/channels to produce a unified joint registration. Multichannel images are created from co-registered multimodality images of the same subject so that information across modalities is exploited comprehensively. In contrast to existing methods, which combine information at the image/intensity level, the proposed method uses a feature-level fusion scheme that spatially adaptively combines the complementary information from different modalities characterizing different tissue types, via Gabor wavelet transforms and Independent Component Analysis (ICA), to produce a robust inter-subject registration. Experiments on both simulated and real multichannel images demonstrate the applicability and robustness of the proposed method. This inter-subject registration is expected to pave the way for subsequent unified population-based multichannel studies.
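To make the feature-level fusion idea concrete, the following is a minimal illustrative sketch (not the authors' pipeline): each channel of a multichannel image is expanded into Gabor filter responses, and ICA is applied to the pooled per-pixel feature vectors to extract statistically independent components that could drive registration. The function names, filter parameters, and the use of `scipy` and `scikit-learn` are all assumptions for illustration only.

```python
import numpy as np
from scipy.ndimage import convolve
from sklearn.decomposition import FastICA

def gabor_kernel(freq, theta, sigma=3.0, size=15):
    """Real part of a 2-D Gabor filter at the given frequency and orientation."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)     # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def gabor_features(channels, freqs=(0.1, 0.2), thetas=(0.0, np.pi / 2)):
    """Stack Gabor responses of every channel into a per-pixel feature vector."""
    feats = []
    for img in channels:                           # one image per modality/channel
        for f in freqs:
            for t in thetas:
                feats.append(convolve(img, gabor_kernel(f, t)))
    return np.stack(feats, axis=-1)                # shape (H, W, n_features)

# Two synthetic "modalities" of the same subject (stand-ins for e.g. T1/T2 MRI).
rng = np.random.default_rng(0)
chan_a = rng.random((32, 32))
chan_b = rng.random((32, 32))

features = gabor_features([chan_a, chan_b])        # (32, 32, 8)
H, W, D = features.shape

# ICA across the pooled per-pixel feature vectors: the resulting independent
# components are a fused, lower-dimensional representation of all channels.
ica = FastICA(n_components=4, random_state=0, max_iter=1000)
components = ica.fit_transform(features.reshape(-1, D)).reshape(H, W, 4)
```

In a registration setting, similarity would then be measured on `components` rather than on raw intensities, which is one way such a fused feature representation could combine complementary information across modalities.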