Abstract:
Predicting bank failures is a critical task that requires balancing the need for model explainability with the necessity of preserving data privacy. Traditional machine learning models often lack transparency, which poses challenges for stakeholders who need to understand the factors driving predictions. In this study, we employ differentially private glass-box models, namely the Explainable Boosting Machine (EBM) and Neural Additive Models (NAM), to address these issues. We analyzed data from 21,243 American banks spanning 1969 to 2021, focusing on key financial ratios. By applying Differential Privacy (DP) to these models, we aimed to protect sensitive financial data while evaluating the trade-offs between privacy, accuracy, and explainability. Our main findings are as follows: 1) In the absence of privacy constraints, the models consistently identified Asset Turnover, Total Debt / Invested Capital, and ROE as the most influential ratios in predicting bank failure, in that order; 2) When the privacy budget $\epsilon \leq 1$, only EBM maintained significant performance; 3) The reduction in explainability due to privacy protection was more pronounced for variables with initially lower explanatory power, while Asset Turnover retained its explanatory power even at $\epsilon = 0.01$. These findings provide valuable insights for banks, policymakers, and investors, suggesting that glass-box models offer a promising solution for reliable and explainable bank failure prediction under privacy constraints.
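The sketch below is not taken from the paper; it is a generic illustration of how the privacy budget $\epsilon$ trades privacy for accuracy, using the standard Laplace mechanism on a single bounded summary statistic rather than the DP-EBM/NAM training procedures evaluated in the study. The dp_mean helper, the [0, 3] clipping range, and the synthetic Asset Turnover values are illustrative assumptions, not details from the paper.

```python
# Generic illustration of the privacy budget epsilon (NOT the paper's
# DP-EBM/NAM training procedure): the Laplace mechanism releases a
# noisy mean of a bounded financial ratio with epsilon-differential privacy.
import numpy as np

rng = np.random.default_rng(42)

def dp_mean(values, lower, upper, epsilon, rng):
    """Return an epsilon-DP estimate of the mean of `values`.

    Values are clipped to [lower, upper]; the sensitivity of the mean
    is (upper - lower) / n, so Laplace noise with scale sensitivity/epsilon
    yields epsilon-differential privacy.
    """
    clipped = np.clip(values, lower, upper)
    n = len(clipped)
    sensitivity = (upper - lower) / n
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clipped.mean() + noise

# Toy stand-in for a financial ratio such as Asset Turnover, assumed to lie in [0, 3].
asset_turnover = rng.normal(0.9, 0.4, size=10_000)

for eps in (10.0, 1.0, 0.1, 0.01):
    est = dp_mean(asset_turnover, lower=0.0, upper=3.0, epsilon=eps, rng=rng)
    print(f"epsilon={eps:>5}: private mean estimate = {est:.4f}")
```

As the loop moves toward $\epsilon = 0.01$, the injected noise grows and the released estimate drifts further from the true mean, mirroring the privacy/accuracy trade-off discussed in the abstract.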
Graphical Abstract: The overall procedure of the differentially private glass-box approach for bank failure prediction.
Published in: IEEE Access (Volume 13)