Abstract:
Advances in machine learning and deep learning have driven progress in stock market prediction, presenting unique opportunities for investors and traders to benefit from model forecasts. However, understanding how these models work is crucial to trusting their predictions. This research applies three prominent Explainable AI (XAI) techniques (SHAP, LIME, and Permutation Importance) to three distinct forecasting models: Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and Decision Tree. It fills a gap in the literature by comparing XAI methods across a combination of “black-box” and “white-box” models. Two key research questions guide the work: how these XAI methods perform across the chosen models, and how consistent global and local explanations are across those models. The findings reveal model-specific behaviors, such as the CNN's emphasis on slightly older data and the LSTM's focus on the immediate past. On the S&P 500 dataset, features such as the ‘Close’ and ‘Adj Close’ prices emerged as significant across models, while ‘Volume’ remained insignificant. This study offers a broader perspective on applying XAI to financial time series forecasting and helps enhance trust and comprehension among stakeholders who rely on these models' predictions.
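To make the comparison concrete, the sketch below shows the general shape of such a pipeline for the white-box Decision Tree case: synthetic OHLCV bars stand in for the S&P 500 data, and the three XAI techniques named in the abstract are applied to a `DecisionTreeRegressor`. All feature construction, model hyperparameters, and the prediction horizon are illustrative assumptions, not the authors' actual setup.

```python
# Minimal sketch, assuming synthetic daily bars in place of the paper's
# S&P 500 data and a DecisionTreeRegressor in place of the authors' models.
import numpy as np
import shap
from lime.lime_tabular import LimeTabularExplainer
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
feature_names = ["Open", "High", "Low", "Close", "Adj Close", "Volume"]

# Synthetic bars: next-day close is driven by Close/Adj Close, with Volume
# as noise (loosely mirroring the abstract's reported importance pattern).
n = 500
close = 100 + np.cumsum(rng.normal(0, 1, n))
X = np.column_stack([
    close + rng.normal(0, 0.5, n),                        # Open
    close + np.abs(rng.normal(0, 1, n)),                  # High
    close - np.abs(rng.normal(0, 1, n)),                  # Low
    close,                                                # Close
    close * 0.98,                                         # Adj Close
    rng.integers(1_000_000, 5_000_000, n).astype(float),  # Volume
])
y = np.roll(close, -1)[:-1]   # next-day close as the prediction target
X = X[:-1]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False, test_size=0.2)
model = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X_tr, y_tr)

# 1) Permutation importance: score drop when each feature is shuffled.
perm = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
for name, imp in zip(feature_names, perm.importances_mean):
    print(f"permutation  {name:<10} {imp:.4f}")

# 2) SHAP (global view): mean |SHAP value| per feature over the test set.
shap_values = shap.TreeExplainer(model).shap_values(X_te)
for name, imp in zip(feature_names, np.abs(shap_values).mean(axis=0)):
    print(f"shap         {name:<10} {imp:.4f}")

# 3) LIME (local view): explanation for a single test-set prediction.
lime_exp = LimeTabularExplainer(X_tr, feature_names=feature_names,
                                mode="regression")
print(lime_exp.explain_instance(X_te[0], model.predict,
                                num_features=6).as_list())
```

Comparing the permutation and SHAP rankings (global) against individual LIME explanations (local) on the same fitted model is one simple way to probe the global-versus-local consistency question the abstract raises.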
Date of Conference: 04-06 December 2023
Date Added to IEEE Xplore: 05 April 2024