Transformer-based Multi-target Regression on Electronic Health Records for Primordial Prevention of Cardiovascular Disease


Abstract:

Machine learning algorithms have been widely used to capture the static and temporal patterns within electronic health records (EHRs). While many studies focus on the (primary) prevention of diseases, primordial prevention (preventing the factors that are known to increase the risk of a disease occurring) is still widely under-investigated. In this study, we propose a multi-target regression model leveraging transformers to learn bidirectional representations of EHR data and predict the future values of 11 major modifiable risk factors of cardiovascular disease (CVD). Inspired by the proven results of pre-training in natural language processing studies, we apply the same principles to EHR data, dividing the training of our model into two phases: pre-training and fine-tuning. We use the fine-tuned transformer model in a "multi-target regression" theme. Following this theme, we combine the 11 disjoint prediction tasks by adding shared and target-specific layers to the model and jointly train the entire model. We evaluate the performance of our proposed method on a large publicly available EHR dataset. Through various experiments, we demonstrate that the proposed method obtains a significant improvement in MAE (12.6% on average across all 11 outputs) over the baselines.
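
The abstract does not give implementation details, but the architecture it describes (a pre-trained transformer encoder feeding a shared layer and 11 target-specific regression heads, trained jointly) can be sketched roughly as follows. This is not the authors' code: the encoder interface, the hidden and shared layer sizes, the [CLS]-style pooling, and the choice of L1 (MAE) loss are all illustrative assumptions.

```python
# Minimal sketch of a transformer-based multi-target regression head for EHR data.
# Assumptions (not from the paper): the encoder returns (batch, seq_len, hidden_dim),
# a [CLS]-style summary token is used for pooling, and the joint loss is MAE.

import torch
import torch.nn as nn


class MultiTargetRegressor(nn.Module):
    def __init__(self, encoder: nn.Module, hidden_dim: int = 256,
                 shared_dim: int = 128, num_targets: int = 11):
        super().__init__()
        self.encoder = encoder  # pre-trained (BERT-style) transformer over EHR sequences
        self.shared = nn.Sequential(       # layer shared across all targets
            nn.Linear(hidden_dim, shared_dim),
            nn.ReLU(),
        )
        # one scalar regression head per modifiable CVD risk factor
        self.heads = nn.ModuleList(
            [nn.Linear(shared_dim, 1) for _ in range(num_targets)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.encoder(x)        # assumed shape: (batch, seq_len, hidden_dim)
        pooled = h[:, 0, :]        # [CLS]-style summary token (assumption)
        z = self.shared(pooled)
        # stack the per-target scalar predictions into one (batch, num_targets) tensor
        return torch.cat([head(z) for head in self.heads], dim=1)


def multi_target_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    # joint training signal: mean absolute error averaged over all targets
    return nn.functional.l1_loss(pred, target)
```

In this setup the 11 disjoint prediction tasks are combined simply by summing/averaging their per-target errors into one loss, so the encoder and shared layer receive gradients from every risk factor while each head remains target-specific.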
Date of Conference: 09-12 December 2021
Date Added to IEEE Xplore: 14 January 2022
PubMed ID: 36684475
Conference Location: Houston, TX, USA

