Abstract:
High-quality X-rays are now available to diagnose lung diseases with the help of radiologists. However, the diagnostic process is time-consuming and depends on specialist availability in medical institutions. Patient information may include chest X-rays of varying quality, medical test results, doctors' notes and prescriptions, and medication details, among others. In this study, we present a model for classifying pulmonary diseases using multimodal data from patient clinical studies and radiographic images. During data preparation, various methods were used to generate artificial samples for both the images and the tabular laboratory-study data. We also propose a method for establishing a correspondence between the generated modalities. The proposed multimodal model was implemented with a late fusion architecture. We conducted experiments on a pulmonary dataset with two modalities. The results show that multimodal data fusion with our model improves accuracy and other metrics compared with the image-only and clinical-data-only modalities. This supports the view that multimodality gives the fusion model more information to learn from and enables a more precise diagnosis than a single modality.
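To make the late fusion idea concrete, below is a minimal sketch of a two-branch classifier that combines branch-level predictions from an image model and a tabular model. This is an illustrative assumption, not the authors' implementation: the PyTorch framework, layer sizes, class count, and the learnable weighting between branches are all hypothetical choices.

```python
# Minimal late-fusion sketch (assumptions: PyTorch; a CNN image branch and an
# MLP tabular branch whose per-class logits are combined at the decision level;
# all layer sizes and the fusion weighting are illustrative, not the paper's).
import torch
import torch.nn as nn


class LateFusionClassifier(nn.Module):
    def __init__(self, num_tabular_features: int, num_classes: int):
        super().__init__()
        # Image branch: small CNN over a chest X-ray, producing per-class logits.
        self.image_branch = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, num_classes),
        )
        # Tabular branch: MLP over laboratory-study features, also per-class logits.
        self.tabular_branch = nn.Sequential(
            nn.Linear(num_tabular_features, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )
        # Learnable weight balancing the two branch-level predictions.
        self.alpha = nn.Parameter(torch.tensor(0.5))

    def forward(self, image: torch.Tensor, tabular: torch.Tensor) -> torch.Tensor:
        img_logits = self.image_branch(image)
        tab_logits = self.tabular_branch(tabular)
        # Late fusion: combine branch predictions rather than intermediate features.
        return self.alpha * img_logits + (1 - self.alpha) * tab_logits


# Example forward pass with a dummy X-ray tensor and a dummy lab-result vector.
model = LateFusionClassifier(num_tabular_features=20, num_classes=3)
logits = model(torch.randn(4, 1, 224, 224), torch.randn(4, 20))
```

The key design point of late fusion is that each modality is modeled independently and the fusion happens only at the prediction stage, so one branch can still contribute when the other modality is weak or noisy.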
Date of Conference: 09-10 February 2024
Date Added to IEEE Xplore: 01 April 2024