Concatenation-Informer: Pre-Distilling and Concatenation Improve Efficiency and Accuracy
J Yin, M Chen, C Zhang, M Zhang, T Xue, T Zhang
2023 28th International Conference on Automation and Computing (ICAC), 2023 - ieeexplore.ieee.org
Time series are ubiquitous in the real world, and many of them are long time series, such as weather records and industrial production records. The inherent long-range dependence of long time series places extremely high demands on a model's feature-extraction ability, and the sequence length itself directly drives up computational cost, which requires the model to be more efficient. This paper proposes Concatenation-Informer, which contains a Pre-distilling operation and a Concatenation-Attention operation, for long time-series forecasting. The Pre-distilling operation shortens the series and effectively extracts context-related features. The Concatenation-Attention operation concatenates the attention mechanism's input and output to improve parameter efficiency. The total space complexity and memory usage of the Concatenation-Informer are lower than those of the Informer.
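The abstract describes the two operations only at a high level. The following is a minimal PyTorch sketch of how a Pre-distilling block and a Concatenation-Attention block could be realized, assuming an Informer-style convolution-plus-max-pooling step that halves the sequence length and a learned projection applied after concatenating the attention input with its output. All class names, layer choices, and dimensions are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class PreDistilling(nn.Module):
    """Sketch (assumption): convolution + max-pooling that halves the
    sequence length before the encoder, extracting local context features."""

    def __init__(self, d_model: int):
        super().__init__()
        self.conv = nn.Conv1d(d_model, d_model, kernel_size=3, padding=1)
        self.act = nn.ELU()
        self.pool = nn.MaxPool1d(kernel_size=3, stride=2, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> (batch, seq_len // 2, d_model)
        y = self.pool(self.act(self.conv(x.transpose(1, 2))))
        return y.transpose(1, 2)


class ConcatenationAttention(nn.Module):
    """Sketch (assumption): self-attention whose input and output are
    concatenated along the feature dimension, then projected back to d_model,
    so the block reuses the input features instead of learning them again."""

    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.proj = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn_out, _ = self.attn(x, x, x)              # standard self-attention
        combined = torch.cat([x, attn_out], dim=-1)   # concatenate input and output
        return self.proj(combined)


if __name__ == "__main__":
    x = torch.randn(4, 96, 64)          # (batch, seq_len, d_model)
    x = PreDistilling(64)(x)            # shorten the series first
    y = ConcatenationAttention(64)(x)
    print(x.shape, y.shape)             # both torch.Size([4, 48, 64])
```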