
Attention multiple instance learning with Transformer aggregation for breast cancer whole slide image classification



Abstract:

Recently, attention-based multiple instance learning (MIL) methods have received increasing attention in histopathology whole slide image (WSI) applications. However, existing attention-based MIL methods rarely consider the cross-channel information interaction of pathology images when identifying discriminative patches. They are also limited in capturing the correlation between different discriminative instances for bag-level classification. To address these challenges, we present a novel attention-based MIL model (AMIL-Trans) for breast cancer WSI classification. AMIL-Trans first embeds efficient channel attention to realize cross-channel interaction on pathology images, computing more robust features for instance selection without introducing much additional computation. It then leverages a vision Transformer encoder to directly aggregate the selected instance features for better bag-level prediction, effectively modeling the correlation between different discriminative instances. Experimental results show that AMIL-Trans achieves AUCs of 94.27% and 84.22% on the Camelyon-16 dataset and the MSK external validation dataset, respectively, demonstrating competitive performance compared with state-of-the-art MIL methods on the breast cancer WSI classification task. The code will be available at https://github.com/CunqiaoHou/AMIL-Trans.
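The two components the abstract describes can be sketched as follows. This is a minimal illustrative sketch in PyTorch, not the authors' released implementation: it assumes a standard efficient-channel-attention block (global average pooling followed by a 1-D convolution across channels) and a ViT-style encoder with a learnable class token for bag-level aggregation; all layer sizes and names here are hypothetical.

```python
import torch
import torch.nn as nn


class ECA(nn.Module):
    """Efficient channel attention: cross-channel interaction via a 1-D conv
    over globally pooled channel descriptors (kernel size k is a hyperparameter)."""
    def __init__(self, k: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)

    def forward(self, x):                            # x: (B, C, H, W) patch features
        y = x.mean(dim=(2, 3))                       # global average pool -> (B, C)
        y = self.conv(y.unsqueeze(1)).squeeze(1)     # local cross-channel mixing
        w = torch.sigmoid(y)                         # per-channel weights in (0, 1)
        return x * w[:, :, None, None]               # recalibrated features


class TransformerAggregator(nn.Module):
    """Aggregate selected instance (patch) features with a Transformer encoder
    and predict the bag label from a learnable class token."""
    def __init__(self, dim: int = 256, heads: int = 4,
                 layers: int = 2, n_classes: int = 2):
        super().__init__()
        self.cls = nn.Parameter(torch.zeros(1, 1, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)
        self.head = nn.Linear(dim, n_classes)

    def forward(self, inst):                         # inst: (B, N, dim) instances
        tok = self.cls.expand(inst.size(0), -1, -1)  # prepend class token
        z = self.encoder(torch.cat([tok, inst], dim=1))
        return self.head(z[:, 0])                    # bag-level logits
```

Self-attention in the encoder lets every selected instance attend to every other, which is how the correlation between discriminative instances enters the bag-level prediction, in contrast to attention-pooling MIL that weights instances independently.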
Date of Conference: 06-08 December 2022
Date Added to IEEE Xplore: 02 January 2023
Conference Location: Las Vegas, NV, USA

