Abstract
Existing Anomaly Detection (AD) methods for time series suffer from inefficient computation and poor interpretability. Since the Transformer model offers high parallel efficiency and the ability to extract relations regardless of distance in Natural Language Processing (NLP) tasks, this paper proposes a Transformer-based method, Masked Time Series Modeling (MTSM). It builds a parallel, direction-free model of the time series data and uses a masking strategy to reconstruct the current timestep, and thereby the whole sequence. Experimental results on a storage system dataset and the NASA spacecraft dataset show that the proposed method saves about 80.7% of the computation time compared with a detection method based on the Long Short-Term Memory (LSTM) model, and achieves an F1 score of 0.582 on the Range-based metric. Moreover, by visualizing the relation matrix it can accurately reflect the relation between anomalies and human instructions.
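The masked-reconstruction idea the abstract describes can be sketched as follows. This is a minimal illustration only: a neighbour-average stands in for the paper's bidirectional Transformer encoder, and all names (`masked_reconstruction_scores`, `window`) are illustrative, not from the paper.

```python
import numpy as np

def masked_reconstruction_scores(series: np.ndarray, window: int = 2) -> np.ndarray:
    """Score each timestep by masking it and reconstructing it from its
    unmasked neighbours on both sides (direction-free context).
    A real MTSM model would predict the masked value with a Transformer;
    here a plain neighbour average stands in for that prediction."""
    n = len(series)
    scores = np.zeros(n)
    for t in range(n):
        # mask position t; gather up to `window` neighbours on each side
        left = series[max(0, t - window):t]
        right = series[t + 1:t + 1 + window]
        context = np.concatenate([left, right])
        recon = context.mean()               # stand-in for the model's reconstruction
        scores[t] = abs(series[t] - recon)   # reconstruction error = anomaly score
    return scores

# A flat series with one spike: the spike yields the largest score.
x = np.array([1.0, 1.0, 1.0, 10.0, 1.0, 1.0, 1.0])
scores = masked_reconstruction_scores(x)
print(int(np.argmax(scores)))  # -> 3
```

Because every position is masked and reconstructed independently from its two-sided context, the scoring loop is trivially parallelizable, which is the efficiency advantage the abstract claims over sequential LSTM decoding.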
Authors
MENG Hengyu; LI Yuanxiang (School of Aeronautics and Astronautics, Shanghai Jiao Tong University, Shanghai 200240, China)
Source
Computer Engineering (《计算机工程》), 2021, No. 2, pp. 69-76 (8 pages)
Indexed in: CAS, CSCD, Peking University Core Journals (北大核心)
Funding
National Natural Science Foundation of China, "Ocean Environment Dynamics and Numerical Simulation" (U1406404).
Keywords
time series data
attention mechanism
Anomaly Detection (AD)
relation extraction
Auto-Encoder (AE)