Graphical Abstract: Overview of the proposed AnoFormer. We propose a new Transformer-based GAN framework with a generator and a critic. To effectively encode the distribution of normal dat...
Abstract:
Time series anomaly detection is the task of determining whether an unseen signal is normal or abnormal, and it is a crucial function in various real-world applications. A typical approach is to learn a normal data representation using generative models, such as the Generative Adversarial Network (GAN), to discriminate between normal and abnormal signals. Recently, a few studies have actively adopted the Transformer to model time series data, but there is no pure Transformer-based GAN framework for time series anomaly detection. As a pioneering work, we propose a new pure Transformer-based GAN framework, called AnoFormer, together with an effective training strategy for better representation learning. Specifically, we improve the detection ability of our model by introducing a two-step masking strategy. The first step is Random Masking: we design a random mask pool to hide parts of the signal randomly, which allows our model to learn the representation of normal data. The second step is Exclusive and Entropy-based Re-masking: a novel refinement step that provides feedback to accurately model the exclusive and uncertain parts from the first step. We empirically demonstrate the effectiveness of the re-masking step, which robustly generates more normal-like signals. Extensive experiments on various datasets show that AnoFormer significantly outperforms state-of-the-art methods in time series anomaly detection.
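The abstract only outlines the two-step masking idea, so the short PyTorch sketch below illustrates one plausible reading of it: drawing a mask from a random mask pool (step 1) and re-masking the steps left visible by the first mask that the model reconstructs with high uncertainty (step 2). The function names (random_mask, entropy_remask), tensor shapes, the zero-fill for hidden steps, and the top-k entropy selection rule are all illustrative assumptions, not AnoFormer's actual interface or training procedure.

    import torch

    def random_mask(x, mask_pool):
        # Step 1 (sketch): draw one mask from a pre-built random mask pool and
        # hide the selected time steps. x: (T, C) signal; mask_pool: (P, T)
        # boolean tensor where True marks positions to hide. Shapes are assumed.
        idx = torch.randint(len(mask_pool), (1,)).item()
        mask = mask_pool[idx]                                  # (T,) bool
        x_masked = x.masked_fill(mask.unsqueeze(-1), 0.0)      # zero out hidden steps
        return x_masked, mask

    def entropy_remask(probs, first_mask, ratio=0.25):
        # Step 2 (sketch): build a second mask over steps the first mask left
        # visible (the "exclusive" part), preferring steps whose predictive
        # distribution probs (T, K) has high entropy (the "uncertain" part).
        entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=-1)  # (T,)
        entropy = entropy.masked_fill(first_mask, float("-inf"))      # skip already-masked steps
        k = max(1, int(ratio * probs.size(0)))
        second_mask = torch.zeros_like(first_mask)
        second_mask[torch.topk(entropy, k).indices] = True
        return second_mask

    # Illustrative usage on a toy signal of T=100 steps, C=1 channel:
    T, C, P, K = 100, 1, 16, 8
    x = torch.randn(T, C)
    mask_pool = torch.rand(P, T) < 0.5                 # hypothetical mask pool
    x_masked, m1 = random_mask(x, mask_pool)
    probs = torch.softmax(torch.randn(T, K), dim=-1)   # stand-in for model outputs
    m2 = entropy_remask(probs, m1)

The top-k entropy rule is just one simple way to pick "uncertain" steps; the paper itself should be consulted for how the re-masking feedback is actually computed and fed back to the generator.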
Published in: IEEE Access (Volume: 11)