Abstract:
Industrial predictive modeling plays an important role in process control and optimization. Industrial process data arising in real-world applications often involve nonlinear and temporal characteristics, which are two main challenges for accurate industrial predictive modeling. Previous transformer-based industrial predictive models considered only the temporal information of industrial time-series data and generally ignored the differing importance of the process variables. In this article, we propose a novel dual cross-attention-based transformer (DCAFormer) to capture both cross-time dependencies and cross-variable dependencies in parallel for better predictability. Specifically, the proposed DCAFormer is composed of a cross-time self-attention layer and a cross-variable self-attention layer. The cross-variable self-attention captures multivariate correlations by inverting the input time series into variate tokens. The de-stationary cross-time self-attention extracts the intrinsic nonstationary information into the temporal dependencies of the time-series data. Comparative and ablation experiments are conducted on a real-world aluminum electrolysis process. The experimental results show that DCAFormer achieves better prediction performance than other competitive transformer models.
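The dual-branch idea in the abstract can be sketched as follows. This is a minimal numpy illustration, not the authors' implementation: it uses a single attention head with identity projections, omits the de-stationary mechanism, and assumes the two parallel branches are fused by simple summation; all function names (`self_attention`, `dual_cross_attention`) are hypothetical. The key point is that the cross-variable branch treats each variable's whole series as one token (obtained by transposing the input), while the cross-time branch treats each time step as one token.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens):
    # Scaled dot-product self-attention with identity Q/K/V projections
    # for brevity; tokens has shape (num_tokens, token_dim).
    d = tokens.shape[-1]
    scores = tokens @ tokens.T / np.sqrt(d)
    return softmax(scores) @ tokens

def dual_cross_attention(x):
    # x: (T time steps, V process variables).
    # Cross-time branch: each of the T time steps is a token of length V,
    # so attention mixes information across time.
    time_out = self_attention(x)
    # Cross-variable branch: invert the series so each of the V variables
    # is a "variate token" of length T; attention mixes across variables.
    var_out = self_attention(x.T).T
    # Parallel branches fused by summation (a simplifying assumption).
    return time_out + var_out

x = np.random.default_rng(0).normal(size=(8, 4))  # 8 steps, 4 variables
y = dual_cross_attention(x)
print(y.shape)  # (8, 4): same shape as the input window
```

In a full transformer each branch would learn its own query/key/value projections and be followed by normalization and feed-forward layers; the transpose trick in the cross-variable branch is what lets standard self-attention model correlations between process variables rather than between time steps.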
Published in: IEEE Transactions on Instrumentation and Measurement ( Volume: 73)