Transformers in Time Series Forecasting: A Brief Transfer Learning Performance Analysis


Abstract:

Transformer models have risen to the challenge of delivering high prediction capacity for long-term time-series forecasting, and several transformer architectures designed for this task are being developed. However, the scarcity of training data in certain domains remains a constant challenge in deep learning. There is therefore a pressing need to develop transformer-based transfer learning techniques, culminating in Transformer-based Time Series Pre-Trained Models (TS-PTMs) that can be applied in such settings. To investigate how pretrained Transformers can be effectively utilized and fine-tuned to improve the accuracy and efficiency of time series forecasting, this paper presents a brief transfer learning performance analysis of four models: the Conformer, the Vanilla Transformer, the LSTM, and the MLP. Three transfer learning (TL) variants are explored and empirically analyzed: total model freezing, feature extraction, and fine-tuning. Initial experimental results show that, given the complexity of transformer models, various fine-tuning and feature extraction techniques still need to be developed for transfer learning to reach maturity in this field.
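
To make the three TL variants concrete, below is a minimal PyTorch sketch of how each one configures the trainable parameters of a pretrained network. The toy model, the layer names body and head, and all hyperparameters are illustrative assumptions, not the architectures or settings used in the paper.

    # Minimal sketch (PyTorch assumed) of the three TL variants named above.
    # The model and the `body`/`head` names are illustrative placeholders.
    import torch
    import torch.nn as nn

    class TinyForecaster(nn.Module):
        """Toy stand-in for a pretrained transformer forecaster."""
        def __init__(self, d_model: int = 64):
            super().__init__()
            self.body = nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True),
                num_layers=2,
            )
            self.head = nn.Linear(d_model, 1)  # new head for the target domain

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, seq_len, d_model) -> one forecast value per time step
            return self.head(self.body(x))

    def apply_tl_strategy(model: TinyForecaster, strategy: str) -> None:
        """Set which parameters are trainable under each TL variant."""
        if strategy == "freeze":
            # Total model freezing: no pretrained weights are updated at all.
            for p in model.parameters():
                p.requires_grad = False
        elif strategy == "feature_extraction":
            # Feature extraction: freeze the pretrained body, train only the head.
            for p in model.parameters():
                p.requires_grad = False
            for p in model.head.parameters():
                p.requires_grad = True
        elif strategy == "fine_tuning":
            # Fine-tuning: every weight stays trainable (often with a small LR).
            for p in model.parameters():
                p.requires_grad = True
        else:
            raise ValueError(f"unknown strategy: {strategy}")

    model = TinyForecaster()
    apply_tl_strategy(model, "feature_extraction")
    # Optimize only the parameters left trainable by the chosen strategy.
    optimizer = torch.optim.Adam(
        (p for p in model.parameters() if p.requires_grad), lr=1e-4
    )

Under feature extraction, only the newly attached head is optimized, which is why the optimizer above is built from the parameters left with requires_grad=True.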
Date of Conference: 19-22 July 2023
Date Added to IEEE Xplore: 25 December 2023
Conference Location: Rhodes (Rodos) Island, Greece
