Abstract:
This paper introduces a novel optimized hybrid model combining Long Short-Term Memory (LSTM) and Transformer deep learning architectures designed for power load forecasting. It leverages the strengths of both LSTM and Transformer models to deliver more accurate and reliable forecasts of power consumption while accounting for geographic factors, user behavioral factors, and constraints on training time. The model is modified to forecast the total power load for multiple consecutive future time instances rather than only the next time instance. We have tested the models using residential power consumption data, and the findings reveal that the optimized hybrid model consistently outperforms existing methods.
Published in: IEEE Transactions on Smart Grid (Volume: 16, Issue: 3, May 2025)
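To illustrate the kind of architecture the abstract describes, the following is a minimal sketch of a hybrid LSTM-Transformer forecaster with a multi-step output head. It is not the authors' implementation: the abstract does not specify layer sizes, stacking order, or forecast horizon, so the dimensions, the LSTM-then-encoder ordering, and the 24-step horizon below are illustrative assumptions only.

```python
# A minimal sketch, not the paper's implementation: the abstract only states that
# LSTM and Transformer components are combined and that the model predicts several
# consecutive future load values. All layer sizes, the stacking order, and the
# horizon are assumptions chosen for illustration.
import torch
import torch.nn as nn

class HybridLSTMTransformer(nn.Module):
    def __init__(self, n_features: int = 8, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2, horizon: int = 24):
        super().__init__()
        # LSTM captures local sequential dynamics of the load series.
        self.lstm = nn.LSTM(n_features, d_model, batch_first=True)
        # Transformer encoder layers model longer-range dependencies
        # over the LSTM hidden states via self-attention.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Multi-step head: predicts `horizon` consecutive future load values
        # instead of only the next time instance.
        self.head = nn.Linear(d_model, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features) -> (batch, horizon)
        h, _ = self.lstm(x)
        z = self.encoder(h)
        return self.head(z[:, -1, :])  # forecast from the last time step's representation

# Example: a week of hourly inputs with 8 features -> the next 24 hourly load values.
model = HybridLSTMTransformer()
y_hat = model(torch.randn(32, 168, 8))
print(y_hat.shape)  # torch.Size([32, 24])
```

The exogenous features here (placeholders for the geographic and user behavioral factors mentioned in the abstract) are simply concatenated with the load series at each time step; how the paper actually encodes those factors is not stated in the abstract.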