Abstract:
This paper proposes a novel optimization principle and its implementation for unsupervised anomaly detection in sound (ADS) using an autoencoder (AE). The goal of the unsupervised-ADS is to detect unknown anomalous sounds without training data of anomalous sounds. The use of an AE as a normal model is a state-of-the-art technique for the unsupervised-ADS. To decrease the false positive rate (FPR), the AE is trained to minimize the reconstruction error of normal sounds, and the anomaly score is calculated as the reconstruction error of the observed sound. Unfortunately, since this training procedure does not take into account the anomaly score for anomalous sounds, the true positive rate (TPR) does not necessarily increase. In this study, we define an objective function based on the Neyman-Pearson lemma by considering the ADS as a statistical hypothesis test. The proposed objective function trains the AE to maximize the TPR under an arbitrarily low FPR condition. To calculate the TPR in the objective function, we consider that the set of anomalous sounds is the complementary set of normal sounds and simulate anomalous sounds by using a rejection sampling algorithm. Through experiments using synthetic data, we found that the proposed method improved the performance measures of the ADS under low FPR conditions. In addition, we confirmed that the proposed method could detect anomalous sounds in real environments.
Published in: IEEE/ACM Transactions on Audio, Speech, and Language Processing ( Volume: 27, Issue: 1, January 2019)
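
The abstract outlines three ingredients that lend themselves to a short illustration: the conventional reconstruction-error anomaly score, a detection threshold chosen for an arbitrarily low FPR, and rejection sampling of pseudo-anomalous data from the complement of the normal set. The following is a minimal NumPy sketch of those ideas only, under toy assumptions; the `reconstruct` function, the Gaussian "normal" data, and the uniform proposal are placeholders, not the authors' model, and the Neyman-Pearson training objective itself is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy stand-ins (assumptions, not the paper's model) --------------------
# "Normal" sounds are feature vectors concentrated near the origin; the
# trained AE is mimicked by a map that reconstructs such vectors well and
# off-manifold vectors poorly.
def reconstruct(x):
    """Placeholder for decoder(encoder(x))."""
    return 0.9 * np.clip(x, -1.0, 1.0)

def anomaly_score(x):
    """Conventional AE anomaly score: squared reconstruction error."""
    return np.sum((x - reconstruct(x)) ** 2, axis=-1)

# --- Threshold set for an arbitrarily low FPR on held-out normal data ------
normal = rng.normal(0.0, 0.3, size=(5000, 8))
target_fpr = 0.01
threshold = np.quantile(anomaly_score(normal), 1.0 - target_fpr)

# --- Rejection sampling of pseudo-anomalies ---------------------------------
# The abstract treats anomalies as the complement of the normal set; here we
# draw candidates from a broad proposal and reject those the toy normal
# model assigns high density to (a crude stand-in for the paper's sampler).
def sample_pseudo_anomalies(n, dim=8):
    out = []
    while len(out) < n:
        cand = rng.uniform(-3.0, 3.0, size=dim)
        normal_density = np.exp(-np.sum(cand ** 2) / (2 * 0.3 ** 2))
        if rng.uniform() > normal_density:  # reject draws that look "normal"
            out.append(cand)
    return np.asarray(out)

pseudo_anom = sample_pseudo_anomalies(1000)
fpr = np.mean(anomaly_score(normal) > threshold)
tpr = np.mean(anomaly_score(pseudo_anom) > threshold)
print(f"FPR on normal ~ {fpr:.3f}, TPR on pseudo-anomalies ~ {tpr:.3f}")
```

In this sketch the threshold is simply the (1 - FPR)-quantile of reconstruction errors on held-out normal data, which mirrors the trade-off the abstract describes: lowering the target FPR raises the threshold, and without an objective that also accounts for anomalous data, the TPR does not necessarily improve.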
- IEEE Keywords
- Index Terms
- Neyman-Pearson Lemma
- Training Data
- Objective Function
- False Positive Rate
- Training Procedure
- Anomaly Detection
- Bellman Equation
- Statistical Hypothesis Testing
- Rejection Sampling
- Anomaly Score
- Set Of Sounds
- Neural Network
- Data Normalization
- Support Vector Machine
- Deep Neural Network
- Expectation Maximization
- Long Short-term Memory
- Experimental Verification
- Input Vector
- Kullback-Leibler
- Gaussian Mixture Model
- Variational Autoencoder
- Latent Vector
- Latent Space
- Pseudo-random Number Generator
- Anomalous Data
- Generative Adversarial Networks
- Water Pump
- Discrete Fourier Transform
- Trade-off Relationship