Robustifying Routers Against Input Perturbations for Sparse Mixture-of-Experts Vision Transformers


Abstract:

Mixture of experts with a sparse expert selection rule has recently gained much attention because of its scalability without compromising inference time. However, unlike standard neural networks, sparse mixture-of-experts models inherently exhibit discontinuities in the output space, which may impede the acquisition of appropriate invariance to input perturbations, leading to degraded performance on tasks such as classification. To address this issue, we propose Pairwise Router Consistency (PRC), which effectively penalizes the discontinuities occurring under natural deformations of input images. Combined with the supervised loss, the PRC loss empirically improves classification accuracy on the ImageNet-1K, CIFAR-10, and CIFAR-100 datasets compared to a baseline method. Notably, our method with 1-expert selection slightly outperforms the baseline method using 2-expert selection. We also confirmed that models trained with our method experience discontinuous changes less frequently under input perturbations.
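
The abstract does not spell out the form of the PRC loss, but its description suggests a consistency penalty on router outputs for two naturally deformed views of the same image. Below is a minimal, hypothetical sketch in PyTorch; the function name pairwise_router_consistency, the symmetric-KL formulation, and the toy shapes are all assumptions for illustration, not the paper's actual definition.

```python
import torch
import torch.nn.functional as F

def pairwise_router_consistency(logits_a: torch.Tensor,
                                logits_b: torch.Tensor) -> torch.Tensor:
    """Hypothetical PRC-style penalty (an assumption, not the paper's
    exact formulation): encourage a sparse-MoE router to assign similar
    expert distributions to two naturally deformed views of one image.

    logits_a, logits_b: router logits of shape (tokens, num_experts),
    computed from the two augmented views.
    """
    p_a = F.softmax(logits_a, dim=-1)
    p_b = F.softmax(logits_b, dim=-1)
    # Symmetric KL between the two routing distributions: an abrupt
    # expert switch under a small perturbation yields a large penalty,
    # while consistent routing yields almost none.
    kl_ab = F.kl_div(p_b.log(), p_a, reduction="batchmean")
    kl_ba = F.kl_div(p_a.log(), p_b, reduction="batchmean")
    return 0.5 * (kl_ab + kl_ba)

# Toy usage with random router logits for two views of the same batch.
logits_a = torch.randn(8, 4)                    # 8 tokens, 4 experts (view A)
logits_b = logits_a + 0.1 * torch.randn(8, 4)   # slightly perturbed view B
print(pairwise_router_consistency(logits_a, logits_b))
```

In training, such a term would be added to the supervised objective, e.g. loss = cross_entropy + lambda_prc * pairwise_router_consistency(...), where lambda_prc is an assumed tunable weight balancing routing consistency against the classification loss.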
Topic: ICASSP 2025
Published in: IEEE Open Journal of Signal Processing ( Volume: 6)
Page(s): 276 - 283
Date of Publication: 30 January 2025
Electronic ISSN: 2644-1322
